Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/07/24 13:29:36 UTC

Build failed in Jenkins: beam_PreCommit_Java_Cron #141

See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/141/display/redirect>

------------------------------------------
[...truncated 6.37 MB...]
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from CREATED to SCHEDULED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from CREATED to SCHEDULED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from CREATED to SCHEDULED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from CREATED to SCHEDULED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from CREATED to SCHEDULED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from CREATED to SCHEDULED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - Job 36a824611e77501ac3de2dc24d775bad was successfully submitted to the JobManager akka://flink/deadLetters.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Job execution switched to status RUNNING.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)(1/1) switched to SCHEDULED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from SCHEDULED to DEPLOYING.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (attempt #0) to localhost
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) switched from CREATED to DEPLOYING.
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) [DEPLOYING]
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)(1/1) switched to DEPLOYING 
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from CREATED to DEPLOYING.
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) [DEPLOYING]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from CREATED to DEPLOYING.
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) [DEPLOYING]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from CREATED to DEPLOYING.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) [DEPLOYING]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from CREATED to DEPLOYING.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) [DEPLOYING]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from CREATED to DEPLOYING.
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) [DEPLOYING]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Received task TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from CREATED to DEPLOYING.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) [DEPLOYING]
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) [DEPLOYING].
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) [DEPLOYING].
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) [DEPLOYING].
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) [DEPLOYING].
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from DEPLOYING to RUNNING.
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)(1/1) switched to RUNNING 
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) exceeded the 80 characters length limit and was truncated.
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) switched from DEPLOYING to RUNNING.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) [DEPLOYING].
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) [DEPLOYING].
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from DEPLOYING to RUNNING.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper - No restore state for UnbounedSourceWrapper.
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper - Unbounded Flink Source 0/1 is reading from sources: [org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter@2d115a4d]
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) [DEPLOYING].
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) [DEPLOYING].
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) [DEPLOYING].
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from DEPLOYING to RUNNING.
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) [DEPLOYING].
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) [DEPLOYING].
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) [DEPLOYING].
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) [DEPLOYING].
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) [DEPLOYING].
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from DEPLOYING to RUNNING.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem(1/1) switched to RUNNING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)(1/1) switched to RUNNING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to RUNNING 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map(1/1) switched to RUNNING 
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) exceeded the 80 characters length limit and was truncated.
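The WARN lines above come from Flink truncating operator and metric names longer than 80 characters; the names are long because Beam derives them from the full transform path. Purely as a hedged illustration (not part of this build), giving transforms short explicit names via apply(String, PTransform) keeps the generated operator names within that limit:

    // Hypothetical sketch, not taken from this test: short explicit transform names
    // keep the Flink operator/metric names that Beam derives from the transform path
    // under the 80-character limit mentioned in the warnings above.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class ShortNamesExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply("Seq", GenerateSequence.from(0).to(10))                         // short name instead of the default
         .apply("Format", MapElements.into(TypeDescriptors.strings())
                                     .via((Long x) -> Long.toString(x)))
         .apply("Write", TextIO.write().to("/tmp/short-names-example/out"));    // illustrative output path
        p.run().waitUntilFinish();
      }
    }

The string passed to apply() becomes the node name in the Flink execution graph, so short names also make log excerpts like the ones in this build far easier to read.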
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing heap keyed state backend with stream factory.
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from DEPLOYING to RUNNING.
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) exceeded the 80 characters length limit and was truncated.
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)(1/1) switched to RUNNING 
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from DEPLOYING to RUNNING.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880)
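The repeated "No state backend has been configured" messages mean the job fell back to Flink's default MemoryStateBackend (state on the heap, checkpoints sent to the JobManager), which is expected for this in-process test. As a plain-Flink sketch only (the Beam FlinkRunner builds the StreamExecutionEnvironment internally, so this is not how the test configures it, and the checkpoint path is hypothetical), an explicitly configured state backend looks like:

    // Plain-Flink illustration, assumptions noted above: replace the default
    // MemoryStateBackend with a filesystem-backed one and enable checkpointing
    // so the backend is actually exercised.
    import org.apache.flink.runtime.state.filesystem.FsStateBackend;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class StateBackendExample {
      public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setStateBackend(new FsStateBackend("file:///tmp/flink-checkpoints"));  // hypothetical path
        env.enableCheckpointing(10_000);  // checkpoint every 10 seconds
        env.fromElements(1, 2, 3).print();
        env.execute("state-backend-example");
      }
    }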
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from DEPLOYING to RUNNING.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:07	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to RUNNING 
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing heap keyed state backend with stream factory.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) exceeded the 80 characters length limit and was truncated.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) exceeded the 80 characters length limit and was truncated.
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper - No restore state for UnbounedSourceWrapper.
    (The message above is quoted as emitted; the wrapper class is UnboundedSourceWrapper.)
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper - Unbounded Flink Source 0/1 is reading from sources: [org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter@d5a388c]
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.sdk.io.WriteFiles - Opening writer ccceca3c-1f87-489f-a362-9872b8f80ea5 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@4d4ba8d0 pane PaneInfo.NO_FIRING destination null
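The source lines above show Beam's UnboundedSourceWrapper reading from a BoundedToUnboundedSourceAdapter: the bounded GenerateSequence source is wrapped as an unbounded one because the pipeline is executed by the Flink runner in streaming mode, after which WriteFiles opens a temporary-file writer for the bundle. A minimal sketch of the options that put a pipeline into this mode, assuming only roughly what the test does (the snippet is illustrative, not the test's actual code):

    // Hedged sketch: FlinkRunner plus streaming=true is what causes bounded reads
    // to go through the bounded-to-unbounded adapter seen in the log above.
    import org.apache.beam.runners.flink.FlinkPipelineOptions;
    import org.apache.beam.runners.flink.FlinkRunner;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StreamingOptionsExample {
      public static void main(String[] args) {
        FlinkPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(FlinkPipelineOptions.class);
        options.setRunner(FlinkRunner.class);
        options.setStreaming(true);  // bounded sources are then wrapped as unbounded
        // ... build and run the pipeline with these options ...
      }
    }

With streaming left at false, the same pipeline would be translated as a batch Flink job and the adapter would not appear in the operator names.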
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing heap keyed state backend with stream factory.
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from RUNNING to FINISHED.
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe).
    [Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) [FINISHED]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (b68fbed4541341b7908928190f3f64fe)
    [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource) (1/1) (b68fbed4541341b7908928190f3f64fe) switched from RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	Source: TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)(1/1) switched to FINISHED 
    [Time Trigger for Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.sdk.io.FileBasedSink$Writer - Successfully wrote temporary file /tmp/junit8461028423339166968/junit7775443102897849981/result/.temp-beam-2018-07-24_12-11-07-6/ccceca3c-1f87-489f-a362-9872b8f80ea5
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) switched from RUNNING to FINISHED.
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866).
    [Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) [FINISHED]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (7799c912b2d53a8238b947f1a771c866)
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from RUNNING to FINISHED.
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f).
    [TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) [FINISHED]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (505e37b2a8fe119d912563baa81e223f)
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from RUNNING to FINISHED.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e).
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) [FINISHED]
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (b3cfc35fad0a94f34dcbcb5481c9b02e)
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous) (1/1) (505e37b2a8fe119d912563baa81e223f) switched from RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b3cfc35fad0a94f34dcbcb5481c9b02e) switched from RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to FINISHED 
    [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem (1/1) (7799c912b2d53a8238b947f1a771c866) switched from RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles) -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)(1/1) switched to FINISHED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	Source: GenerateSequence/Read(BoundedCountingSource) -> ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/RewindowIntoGlobal/Window.Assign.out -> TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles -> ToKeyedWorkItem(1/1) switched to FINISHED 
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from RUNNING to FINISHED.
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07).
    [TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) [FINISHED]
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (2093eb711bdef48a7ee795e6c41cfd07)
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.sdk.io.WriteFiles - Finalizing 1 file results
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.sdk.io.FileBasedSink - Finalizing for destination null num shards 1.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map (1/1) (2093eb711bdef48a7ee795e6c41cfd07) switched from RUNNING to FINISHED.
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.sdk.io.FileBasedSink - Will copy temporary file FileResult{tempFilename=/tmp/junit8461028423339166968/junit7775443102897849981/result/.temp-beam-2018-07-24_12-11-07-6/ccceca3c-1f87-489f-a362-9872b8f80ea5, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4d4ba8d0, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/junit8461028423339166968/junit7775443102897849981/result/file.txt-00000-of-00001
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate) -> TextIO.Write/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous) -> Map(1/1) switched to FINISHED 
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.beam.sdk.io.FileBasedSink - Will remove known temporary file /tmp/junit8461028423339166968/junit7775443102897849981/result/.temp-beam-2018-07-24_12-11-07-6/ccceca3c-1f87-489f-a362-9872b8f80ea5
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from RUNNING to FINISHED.
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8).
    [TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) [FINISHED]
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (b5070bdd71ecc5c7277e2d77642047b8)
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem (1/1) (b5070bdd71ecc5c7277e2d77642047b8) switched from RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign.out -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) -> ToKeyedWorkItem(1/1) switched to FINISHED 
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from RUNNING to FINISHED.
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7).
    [TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) [FINISHED]
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskmanager.TaskManager - Un-registering task and sending final execution state FINISHED to JobManager for task TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (314cfafdcaae1b61d0bf9522430a2df7)
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous) (1/1) (314cfafdcaae1b61d0bf9522430a2df7) switched from RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) -> TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)(1/1) switched to FINISHED 
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job testflinkrunner-jenkins-0724121107-9a64b7c7 (36a824611e77501ac3de2dc24d775bad) switched from state RUNNING to FINISHED.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint coordinator for job 36a824611e77501ac3de2dc24d775bad.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - Shutting down
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - 07/24/2018 12:11:08	Job execution switched to status FINISHED.
    [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - Terminate JobClientActor.
    [Test worker] INFO org.apache.flink.runtime.client.JobClient - Job execution complete
    [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.client.JobSubmissionClientActor - Disconnect from JobManager Actor[akka://flink/user/jobmanager_1#1511242515].
    [Test worker] INFO org.apache.beam.runners.flink.FlinkRunner - Execution finished in 322 msecs
    [Test worker] INFO org.apache.beam.runners.flink.FlinkRunner - Final accumulator values:
    [Test worker] INFO org.apache.beam.runners.flink.FlinkRunner - __metricscontainers : org.apache.beam.runners.core.metrics.MetricsContainerStepMap@9f980b54
    [Test worker] INFO org.apache.beam.runners.flink.ReadSourceStreamingTest - 
    --------------------------------------------------------------------------------
    Test testJob(org.apache.beam.runners.flink.ReadSourceStreamingTest) successfully run.
    ================================================================================

org.apache.beam.runners.flink.ReadSourceStreamingTest STANDARD_ERROR
    [Test worker] INFO org.apache.flink.runtime.minicluster.FlinkMiniCluster - Stopping FlinkMiniCluster.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskmanager.TaskManager - Stopping TaskManager akka://flink/user/taskmanager_1#-1413768013.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskmanager.TaskManager - Disassociating from JobManager
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmanager.JobManager - Stopping JobManager akka://flink/user/jobmanager_1.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
    [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:38287
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
    [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskmanager.TaskManager - Task manager akka://flink/user/taskmanager_1 is completely shut down.

org.apache.beam.runners.flink.PipelineOptionsTest > parDoBaseClassPipelineOptionsSerializationTest STANDARD_ERROR
    [Test worker] INFO org.apache.flink.api.java.typeutils.TypeExtractor - No fields were detected for class org.apache.beam.sdk.util.WindowedValue so it cannot be used as a POJO type and must be processed as GenericType. Please read the Flink documentation on "Data Types & Serialization" for details of the effect on performance.
    [Test worker] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager uses directory /tmp/flink-io-95e6631e-a738-4ab0-a4c0-3726108757cd for spill files.
    [Test worker] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /tmp/flink-io-95e6631e-a738-4ab0-a4c0-3726108757cd
Finished generating test XML results (5.67 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/flink/build/test-results/test>
Generating HTML test report...
Finished generating test html results (58.276 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/flink/build/reports/tests/test>
Packing task ':beam-runners-flink_2.11:test'
:beam-runners-flink_2.11:test (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 3 mins 50.575 secs.
:beam-runners-flink_2.11:check (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-runners-flink_2.11:check
Skipping task ':beam-runners-flink_2.11:check' as it has no actions.
:beam-runners-flink_2.11:check (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:beam-runners-flink_2.11:build (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-runners-flink_2.11:build
Skipping task ':beam-runners-flink_2.11:build' as it has no actions.
:beam-runners-flink_2.11:build (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
Expiring Daemon because JVM Tenured space is exhausted

> Task :beam-sdks-java-extensions-sql:compileJava FAILED
:beam-sdks-java-extensions-sql:compileJava (Thread[Task worker for ':' Thread 39,5,main]) completed. Took 16 mins 11.556 secs.
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
Expiring Daemon because JVM Tenured space is exhausted
JVM garbage collector is thrashing. Daemon will be stopped immediately
Daemon is stopping immediately JVM garbage collector thrashing
Stop requested. Daemon is removing its presence from the registry...
Expiring Daemon because JVM Tenured space is exhausted

FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon has been stopped: JVM garbage collector thrashing

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
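
The daemon died because its old generation filled up and stayed full while compiling beam-sdks-java-extensions-sql. One possible mitigation, assuming the default daemon heap is simply too small for this precommit rather than the build leaking memory, is to give the daemon more headroom in gradle.properties; a minimal sketch (sizes are illustrative, not tuned values):

    # gradle.properties -- sketch only
    org.gradle.jvmargs=-Xmx4g -XX:MaxMetaspaceSize=1g
    # or avoid the long-lived daemon on CI entirely
    org.gradle.daemon=false

Both properties are standard Gradle settings; whether these numbers fit depends on the memory available on the Jenkins executor.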
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Jenkins build is back to normal : beam_PreCommit_Java_Cron #149

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/149/display/redirect?page=changes>


Build failed in Jenkins: beam_PreCommit_Java_Cron #148

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/148/display/redirect?page=changes>

Changes:

[altay] Remove reference to dataflow-distribution.properties

[lcwik] [BEAM-4629] Output the names of the failing licenses as part of the

[aaltay] [BEAM-4859] Enable Python VR tests in streaming in postcommit task

------------------------------------------
[...truncated 17.58 MB...]
    INFO: 2018-07-26T06:18:06.230Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.279Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:364#1d275f544daf228c
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.320Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map, through flatten WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.394Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.440Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.477Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.529Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.577Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.625Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.677Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.725Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.759Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.792Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.824Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.869Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.904Z: Unzipping flatten s13-u58 for input s14.org.apache.beam.sdk.values.PCollection.<init>:364#f0cbc4d341b04049-c56
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.940Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign, through flatten s13-u58, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.985Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.017Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.059Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.112Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.160Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.207Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.243Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.282Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.326Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.367Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.414Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.451Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial into WordCount.CountWords/Count.PerElement/Init/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.485Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.529Z: Fusing consumer Window.Into()/Window.Assign into ParDo(AddTimestamp)
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.574Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.609Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.654Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.702Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.746Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.787Z: Fusing consumer MapElements/Map into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.821Z: Fusing consumer ParDo(AddTimestamp) into TextIO.Read/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.864Z: Fusing consumer WordCount.CountWords/ParDo(ExtractWords) into Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.912Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.956Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into MapElements/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.985Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.020Z: Fusing consumer WordCount.CountWords/Count.PerElement/Init/Map into WordCount.CountWords/ParDo(ExtractWords)
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.064Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.511Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.548Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.584Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.608Z: Starting 1 workers in us-central1-b...
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.618Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.646Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.917Z: Executing operation TextIO.Read/Read+ParDo(AddTimestamp)+Window.Into()/Window.Assign+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
    Jul 26, 2018 6:18:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:20.196Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 26, 2018 6:18:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:30.700Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 26, 2018 6:18:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:30.729Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 26, 2018 6:18:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:53.945Z: Workers have started successfully.
    Jul 26, 2018 6:19:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:12.526Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
    Jul 26, 2018 6:19:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:12.609Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:27.399Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:27.502Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:28.617Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:28.712Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 26, 2018 6:19:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:34.182Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 26, 2018 6:19:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:34.262Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 26, 2018 6:19:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:44.106Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 26, 2018 6:19:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:44.187Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 26, 2018 6:19:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:48.459Z: Cleaning up.
    Jul 26, 2018 6:19:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:48.550Z: Stopping worker pool...
    Jul 26, 2018 6:22:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:22:04.367Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 26, 2018 6:22:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:22:04.432Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 26, 2018 6:22:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:22:04.478Z: Worker pool stopped.
    Jul 26, 2018 6:22:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-25_23_17_58-16190622390216562509 finished with status DONE.
    Jul 26, 2018 6:22:13 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-25_23_17_58-16190622390216562509. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 26, 2018 6:22:15 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-25_23_17_58-16190622390216562509 finished with status DONE.

Gradle Test Executor 95 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java-examples:preCommit
Finished generating test XML results (0.004 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/test-results/preCommit>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/reports/tests/preCommit>
Packing task ':beam-runners-google-cloud-dataflow-java-examples:preCommit'
:beam-runners-google-cloud-dataflow-java-examples:preCommit (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 13 mins 34.323 secs.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 10,5,main]) started.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':',5,main]) started.

> Task :beam-examples-java:preCommit
Skipping task ':beam-examples-java:preCommit' as it has no actions.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java-examples:test NO-SOURCE
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:test' as it has no source files and no previous output files.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':',5,main]) completed. Took 0.001 secs.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:check
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:check' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:build
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:build' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-examples-java:buildDependents
Caching disabled for task ':beam-examples-java:buildDependents': Caching has not been enabled for the task
Task ':beam-examples-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-sdks-java-io-google-cloud-platform:buildDependents
Caching disabled for task ':beam-sdks-java-io-google-cloud-platform:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-google-cloud-platform:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-protobuf:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
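
Both resolution failures above point at the same missing artifact: the tests-classifier jar for org.apache.zookeeper:zookeeper:3.4.6, which was only searched for in the local /home/jenkins/.m2 cache. A hedged sketch of the usual workarounds, assuming the artifact itself is fine and only the repository configuration or the local cache is at fault (the repository wiring below is illustrative, not Beam's actual build configuration):

    // build.gradle -- sketch only; the coordinates are taken from the error above
    repositories {
      mavenLocal()
      mavenCentral()   // lets the :tests classifier resolve remotely when ~/.m2 is stale
    }
    dependencies {
      testImplementation "org.apache.zookeeper:zookeeper:3.4.6:tests"
    }

Re-running the job with --refresh-dependencies, or deleting the stale zookeeper directory under /home/jenkins/.m2/repository, would also force a fresh download.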

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 1s
648 actionable tasks: 643 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/jgqrge6og4bsy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #147

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/147/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-3709] Implementing new combine URNs in python.

[kedin] [SQL] Enable running BeamSqlLine from gradle

[lcwik] [BEAM-4866] Fix missing licenses.

[pablo] Removing scoped metrics container

[pablo] Remove old style metrics context management

[garrettjonesgoogle] Bumping versions that were missed in #5988

[lcwik] [BEAM-4176] Initial implementation for running portable runner tests

[pablo] Fix Java Nightly Snapshot Failures

------------------------------------------
[...truncated 11.55 MB...]
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[CARDINALITY($1)])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[CARDINALITY($t1)], EXPR$0=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testSelectRowsFromArrayOfRows STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_arrayOfRows`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_arrayOfRows=[$1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], f_arrayOfRows=[$t1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testUnnestLiteral STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `EXPR$0`.`EXPR$0`
    FROM UNNEST(ARRAY['a', 'b', 'c']) AS `EXPR$0`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[$0])
      Uncollect
        LogicalProject(EXPR$0=[ARRAY('a', 'b', 'c')])
          LogicalValues(tuples=[[{ 0 }]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0=[{inputs}], EXPR$0=[$t0])
      BeamUncollectRel
        BeamCalcRel(expr#0=[{inputs}], expr#1=['a'], expr#2=['b'], expr#3=['c'], expr#4=[ARRAY($t1, $t2, $t3)], EXPR$0=[$t4])
          BeamValuesRel(tuples=[[{ 0 }]])
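
For reference, an UNNEST-of-a-literal query like the one above needs no input table, so it can be issued directly against the pipeline through the Beam SQL extension. A minimal sketch (the SqlTransform wiring here is illustrative and is not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class UnnestLiteralSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // The array literal is the source, so the query is applied to the pipeline itself.
        PCollection<Row> letters =
            p.apply(SqlTransform.query("SELECT * FROM UNNEST(ARRAY['a', 'b', 'c'])"));
        p.run().waitUntilFinish();
      }
    }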


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testUnnestNamedLiteral STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `t`.`f_string`
    FROM UNNEST(ARRAY['a', 'b', 'c']) AS `t` (`f_string`)
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_string=[$0])
      Uncollect
        LogicalProject(EXPR$0=[ARRAY('a', 'b', 'c')])
          LogicalValues(tuples=[[{ 0 }]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0=[{inputs}], f_string=[$t0])
      BeamUncollectRel
        BeamCalcRel(expr#0=[{inputs}], expr#1=['a'], expr#2=['b'], expr#3=['c'], expr#4=[ARRAY($t1, $t2, $t3)], EXPR$0=[$t4])
          BeamValuesRel(tuples=[[{ 0 }]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testSelectSingleRowFromArrayOfRows STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_arrayOfRows`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0$0=[ITEM($1, 1).f_rowString], EXPR$0$1=[ITEM($1, 1).f_rowInt])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[ITEM($t1, $t2)], expr#4=[$t3.f_rowString], expr#5=[$t3.f_rowInt], EXPR$0$0=[$t4], EXPR$0$1=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testProjectArrayField STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_int`, `PCOLLECTION`.`f_stringArr`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[$0], f_stringArr=[$1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], proj#0..1=[{exprs}])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayFieldAccess STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedArray=[$4])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], f_nestedArray=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorBraces STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowFieldAccess STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedString`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedString=[$2])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedString], f_nestedString=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayElementAccess STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[ITEM($4, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], expr#3=[1], expr#4=[ITEM($t2, $t3)], EXPR$0=[$t4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])
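
These BeamSqlDslNestedRowsTest logs show the planner turning nested-row SQL into a BeamCalcRel over the single PCOLLECTION input. As a rough illustration of what such a test exercises, the sketch below applies one of the queries above to a PCollection<Row> with the Java SDK's SqlTransform; the field names mirror the plans, but the pipeline setup and values are illustrative, and the exact builder signatures may differ slightly in the 2.7.0-SNAPSHOT APIs under test here.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class NestedRowSqlSketch {
      public static void main(String[] args) {
        // Nested schema mirroring the f_nestedRow fields referenced in the plans above.
        Schema nested =
            Schema.builder()
                .addInt32Field("f_nestedInt")
                .addStringField("f_nestedString")
                .build();
        Schema schema =
            Schema.builder().addInt32Field("f_int").addRowField("f_nestedRow", nested).build();

        Row row =
            Row.withSchema(schema)
                .addValues(1, Row.withSchema(nested).addValues(42, "BB").build())
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<Row> input = p.apply(Create.of(row).withCoder(RowCoder.of(schema)));

        // A single-input query sees its input as the table PCOLLECTION. SQL array indexes
        // are 1-based, which is why the tests above access f_nestedArray[1].
        input.apply(
            SqlTransform.query(
                "SELECT PCOLLECTION.f_nestedRow.f_nestedString FROM PCOLLECTION"));

        p.run().waitUntilFinish();
      }
    }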


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorKeyword STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

Finished generating test XML results (0.009 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.02 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/reports/tests/test>
Packing task ':beam-sdks-java-extensions-sql:test'
:beam-sdks-java-extensions-sql:test (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 46.522 secs.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-extensions-sql:check
Skipping task ':beam-sdks-java-extensions-sql:check' as it has no actions.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-extensions-sql:build
Skipping task ':beam-sdks-java-extensions-sql:build' as it has no actions.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 11s
611 actionable tasks: 610 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/sd5tvnfvyhuua

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #146

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/146/display/redirect?page=changes>

Changes:

[kirpichov] Converts BoundedReadFromUnboundedSource to a DoFn

[kirpichov] Converts SolrIO away from BoundedSource

[thw] [BEAM-4842] Update Flink Runner to Flink 1.5.1

------------------------------------------
[...truncated 11.69 MB...]
    SELECT `PCOLLECTION`.`f_arrayOfRows`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0$0=[ITEM($1, 1).f_rowString], EXPR$0$1=[ITEM($1, 1).f_rowInt])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[ITEM($t1, $t2)], expr#4=[$t3.f_rowString], expr#5=[$t3.f_rowInt], EXPR$0$0=[$t4], EXPR$0$1=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testProjectArrayField STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_int`, `PCOLLECTION`.`f_stringArr`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[$0], f_stringArr=[$1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], proj#0..1=[{exprs}])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayFieldAccess STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedArray=[$4])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], f_nestedArray=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorBraces STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowFieldAccess STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedString`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedString=[$2])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedString], f_nestedString=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayElementAccess STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[ITEM($4, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], expr#3=[1], expr#4=[ITEM($t2, $t3)], EXPR$0=[$t4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorKeyword STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


Gradle Test Executor 114 finished executing tests.

> Task :beam-sdks-java-extensions-sql-jdbc:shadowJarTest
Build cache key for task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest' is 3f20f6965c3d49dd7aaea67fa8eea703
Task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest' is not up-to-date because:
  Task.upToDateWhen is false.
Not loading task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest' from cache because loading from cache is disabled for this task
Starting process 'Gradle Test Executor 116'. Working directory: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc> Command: /usr/local/asfpackages/java/jdk1.8.0_172/bin/java -Ddriver.jar=<https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc/build/libs/beam-sdks-java-extensions-sql-jdbc-2.7.0-SNAPSHOT.jar> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/4.8/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 116'
Successfully started process 'Gradle Test Executor 116'

org.apache.beam.sdk.extensions.sql.jdbc.JdbcJarTest > classLoader_readFile STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern /tmp/junit4347623070925444386/junit5944966499373162478.tmp matched 1 files with total size 0
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.io.FileBasedSource split
    INFO: Splitting filepattern /tmp/junit4347623070925444386/junit5944966499373162478.tmp into bundles of size 0 took 1 ms and produced 1 files and 0 bundles

> Task :beam-sdks-java-extensions-sql:test
Finished generating test XML results (0.015 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.059 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/reports/tests/test>
Packing task ':beam-sdks-java-extensions-sql:test'
:beam-sdks-java-extensions-sql:test (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 43.74 secs.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-extensions-sql:check
Skipping task ':beam-sdks-java-extensions-sql:check' as it has no actions.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-extensions-sql:build
Skipping task ':beam-sdks-java-extensions-sql:build' as it has no actions.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 116 finished executing tests.

> Task :beam-sdks-java-extensions-sql-jdbc:shadowJarTest
Finished generating test XML results (0.0 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc/build/test-results/shadowJarTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc/build/reports/tests/shadowJarTest>
Packing task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest'
:beam-sdks-java-extensions-sql-jdbc:shadowJarTest (Thread[Task worker for ':',5,main]) completed. Took 7.415 secs.
:beam-sdks-java-extensions-sql-jdbc:preCommit (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-java-extensions-sql-jdbc:preCommit
Skipping task ':beam-sdks-java-extensions-sql-jdbc:preCommit' as it has no actions.
:beam-sdks-java-extensions-sql-jdbc:preCommit (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 7 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':rat'.
> Found 5 files with unapproved/unknown licenses. See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/build/reports/rat/rat-report.txt>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
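
For context on failure 1: the :rat task runs the Apache Release Audit Tool, which fails the build when checked-in files lack an approved license header; the five offending files are listed only in the rat-report.txt linked above. For a Java source in Beam, the expected header is the standard ASF block:

    /*
     * Licensed to the Apache Software Foundation (ASF) under one
     * or more contributor license agreements.  See the NOTICE file
     * distributed with this work for additional information
     * regarding copyright ownership.  The ASF licenses this file
     * to you under the Apache License, Version 2.0 (the
     * "License"); you may not use this file except in compliance
     * with the License.  You may obtain a copy of the License at
     *
     *     http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */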

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-5:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

6: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-2:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

7: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-jms:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/io/jms/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 4s
600 actionable tasks: 597 executed, 3 from cache

Publishing build scan...
https://gradle.com/s/7kgqlrrwmc2pq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #145

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/145/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4838] Add dockerfile for standalone Jenkins. Plugins included.

------------------------------------------
[...truncated 14.99 MB...]
                  LogicalProject(auction=[$0], $f1=[HOP($3, 5000, 10000)])
                    BeamIOSourceRel(table=[[beam, Bid]])

    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], proj#0..1=[{exprs}])
      BeamJoinRel(condition=[AND(=($2, $4), >=($1, $3))], joinType=[inner])
        BeamCalcRel(expr#0..2=[{inputs}], auction=[$t0], num=[$t2], starttime=[$t1])
          BeamAggregationRel(group=[{0, 1}], num=[COUNT()])
            BeamCalcRel(expr#0..4=[{inputs}], expr#5=[5000], expr#6=[10000], expr#7=[HOP($t3, $t5, $t6)], auction=[$t0], $f1=[$t7])
              BeamIOSourceRel(table=[[beam, Bid]])
        BeamCalcRel(expr#0..1=[{inputs}], maxnum=[$t1], starttime=[$t0])
          BeamAggregationRel(group=[{1}], maxnum=[MAX($0)])
            BeamCalcRel(expr#0..2=[{inputs}], num=[$t2], starttime=[$t1])
              BeamAggregationRel(group=[{0, 1}], num=[COUNT()])
                BeamCalcRel(expr#0..4=[{inputs}], expr#5=[5000], expr#6=[10000], expr#7=[HOP($t3, $t5, $t6)], auction=[$t0], $f1=[$t7])
                  BeamIOSourceRel(table=[[beam, Bid]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery3Test > testJoinsPeopleWithAuctions STANDARD_ERROR
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `P`.`name`, `P`.`city`, `P`.`state`, `A`.`id`
    FROM `beam`.`Auction` AS `A`
    INNER JOIN `beam`.`Person` AS `P` ON `A`.`seller` = `P`.`id`
    WHERE `A`.`category` = 10 AND (`P`.`state` = 'OR' OR `P`.`state` = 'ID' OR `P`.`state` = 'CA')
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(name=[$11], city=[$14], state=[$15], id=[$0])
      LogicalFilter(condition=[AND(=($8, 10), OR(=($15, 'OR'), =($15, 'ID'), =($15, 'CA')))])
        LogicalJoin(condition=[=($7, $10)], joinType=[inner])
          BeamIOSourceRel(table=[[beam, Auction]])
          BeamIOSourceRel(table=[[beam, Person]])

    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..17=[{inputs}], name=[$t11], city=[$t14], state=[$t15], id=[$t0])
      BeamJoinRel(condition=[=($7, $10)], joinType=[inner])
        BeamCalcRel(expr#0..9=[{inputs}], expr#10=[10], expr#11=[=($t8, $t10)], proj#0..9=[{exprs}], $condition=[$t11])
          BeamIOSourceRel(table=[[beam, Auction]])
        BeamCalcRel(expr#0..7=[{inputs}], expr#8=['OR'], expr#9=[=($t5, $t8)], expr#10=['ID'], expr#11=[=($t5, $t10)], expr#12=['CA'], expr#13=[=($t5, $t12)], expr#14=[OR($t9, $t11, $t13)], proj#0..7=[{exprs}], $condition=[$t14])
          BeamIOSourceRel(table=[[beam, Person]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery7Test > testBids STANDARD_ERROR
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`
    FROM (SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`, TUMBLE_START(`B`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
    FROM `beam`.`Bid` AS `B`
    GROUP BY `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`, TUMBLE(`B`.`dateTime`, INTERVAL '10' SECOND)) AS `B`
    INNER JOIN (SELECT MAX(`B1`.`price`) AS `maxprice`, TUMBLE_START(`B1`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
    FROM `beam`.`Bid` AS `B1`
    GROUP BY TUMBLE(`B1`.`dateTime`, INTERVAL '10' SECOND)) AS `B1` ON `B`.`starttime` = `B1`.`starttime` AND `B`.`price` = `B1`.`maxprice`
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], extra=[$4])
      LogicalJoin(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
        LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], extra=[$4], starttime=[$5])
          LogicalAggregate(group=[{0, 1, 2, 3, 4, 5}])
            LogicalProject(auction=[$0], price=[$2], bidder=[$1], dateTime=[$3], extra=[$4], $f5=[TUMBLE($3, 10000)])
              BeamIOSourceRel(table=[[beam, Bid]])
        LogicalProject(maxprice=[$1], starttime=[$0])
          LogicalAggregate(group=[{0}], maxprice=[MAX($1)])
            LogicalProject($f0=[TUMBLE($3, 10000)], price=[$2])
              BeamIOSourceRel(table=[[beam, Bid]])

    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..7=[{inputs}], proj#0..4=[{exprs}])
      BeamJoinRel(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
        BeamCalcRel(expr#0..5=[{inputs}], proj#0..5=[{exprs}])
          BeamAggregationRel(group=[{0, 1, 2, 3, 4, 5}])
            BeamCalcRel(expr#0..4=[{inputs}], expr#5=[10000], expr#6=[TUMBLE($t3, $t5)], auction=[$t0], price=[$t2], bidder=[$t1], dateTime=[$t3], extra=[$t4], $f5=[$t6])
              BeamIOSourceRel(table=[[beam, Bid]])
        BeamCalcRel(expr#0..1=[{inputs}], maxprice=[$t1], starttime=[$t0])
          BeamAggregationRel(group=[{0}], maxprice=[MAX($1)])
            BeamCalcRel(expr#0..4=[{inputs}], expr#5=[10000], expr#6=[TUMBLE($t3, $t5)], $f0=[$t6], price=[$t2])
              BeamIOSourceRel(table=[[beam, Bid]])
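
The TUMBLE(`dateTime`, INTERVAL '10' SECOND) grouping in this Nexmark query corresponds to fixed (tumbling) ten-second event-time windows. In the Java DSL the equivalent windowing step is roughly the sketch below, assuming a PCollection<Row> of bids; this illustrates only the windowing, not the rest of the query.

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.joda.time.Duration;

    class TumbleSketch {
      // Java-DSL counterpart of GROUP BY TUMBLE(dateTime, INTERVAL '10' SECOND):
      // place elements into 10-second fixed windows before grouping and aggregating.
      static PCollection<Row> tumble10s(PCollection<Row> bids) {
        return bids.apply(Window.into(FixedWindows.of(Duration.standardSeconds(10))));
      }
    }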


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery2Test > testSkipsEverySecondElement STANDARD_ERROR
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`auction`, `PCOLLECTION`.`bidder`, `PCOLLECTION`.`price`, `PCOLLECTION`.`dateTime`, `PCOLLECTION`.`extra`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    WHERE MOD(`PCOLLECTION`.`auction`, 2) = 0
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], bidder=[$1], price=[$2], dateTime=[$3], extra=[$4])
      LogicalFilter(condition=[=(MOD($0, 2), 0)])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], expr#5=[2], expr#6=[MOD($t0, $t5)], expr#7=[0], expr#8=[=($t6, $t7)], proj#0..4=[{exprs}], $condition=[$t8])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery2Test > testSkipsEveryThirdElement STANDARD_ERROR
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`auction`, `PCOLLECTION`.`bidder`, `PCOLLECTION`.`price`, `PCOLLECTION`.`dateTime`, `PCOLLECTION`.`extra`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    WHERE MOD(`PCOLLECTION`.`auction`, 3) = 0
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], bidder=[$1], price=[$2], dateTime=[$3], extra=[$4])
      LogicalFilter(condition=[=(MOD($0, 3), 0)])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], expr#5=[3], expr#6=[MOD($t0, $t5)], expr#7=[0], expr#8=[=($t6, $t7)], proj#0..4=[{exprs}], $condition=[$t8])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery1Test > testConvertsPriceToEur STANDARD_ERROR
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`auction`, `PCOLLECTION`.`bidder`, `DolToEur`(`PCOLLECTION`.`price`) AS `price`, `PCOLLECTION`.`dateTime`, `PCOLLECTION`.`extra`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], bidder=[$1], price=[DolToEur($2)], dateTime=[$3], extra=[$4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], expr#5=[DolToEur($t2)], proj#0..1=[{exprs}], price=[$t5], dateTime=[$t3], extra=[$t4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])
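
The DolToEur call in this query is a user-defined scalar function rather than a built-in. A minimal sketch of how such a UDF is registered with SqlTransform follows; the wrapper class, method, and conversion factor here are illustrative, not the actual Nexmark implementation.

    import org.apache.beam.sdk.extensions.sql.BeamSqlUdf;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class SqlUdfSketch {
      // Scalar UDF: Beam SQL resolves the static eval() method by convention.
      public static class DolToEur implements BeamSqlUdf {
        public static Long eval(Long dollarPrice) {
          return dollarPrice * 89 / 100;  // illustrative conversion factor
        }
      }

      static PCollection<Row> convertPrices(PCollection<Row> bids) {
        return bids.apply(
            SqlTransform.query(
                    "SELECT auction, bidder, DolToEur(price) AS price, dateTime, extra "
                        + "FROM PCOLLECTION")
                .registerUdf("DolToEur", DolToEur.class));
      }
    }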


Gradle Test Executor 120 finished executing tests.

> Task :beam-sdks-java-nexmark:test
Finished generating test XML results (0.001 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/nexmark/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/nexmark/build/reports/tests/test>
Packing task ':beam-sdks-java-nexmark:test'
:beam-sdks-java-nexmark:test (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 1 mins 27.349 secs.
:beam-sdks-java-nexmark:check (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-nexmark:check
Skipping task ':beam-sdks-java-nexmark:check' as it has no actions.
:beam-sdks-java-nexmark:check (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-nexmark:build (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-nexmark:build
Skipping task ':beam-sdks-java-nexmark:build' as it has no actions.
:beam-sdks-java-nexmark:build (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-nexmark:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-nexmark:buildDependents
Caching disabled for task ':beam-sdks-java-nexmark:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-nexmark:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-nexmark:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-sql:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.
:beam-sdks-java-io-kafka:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-extensions-sql:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-sql:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-sql:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-sql:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-join-library:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-io-kafka:buildDependents
Caching disabled for task ':beam-sdks-java-io-kafka:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-kafka:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-kafka:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.

> Task :beam-sdks-java-extensions-join-library:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-join-library:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-join-library:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-join-library:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':rat'.
> Found 5 files with unapproved/unknown licenses. See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/build/reports/rat/rat-report.txt>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-examples-java:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-jms:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/io/jms/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 26s
632 actionable tasks: 627 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/2slrga7k32x5y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #144

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/144/display/redirect?page=changes>

Changes:

[thw] [BEAM-4847] Reduce Gradle JVM Xmx to 4g to fix Jenkins build failures.

[aaltay] Automate 'Start a snapshot build' step in beam release guide (#6042)

------------------------------------------
[...truncated 17.85 MB...]
    INFO: 2018-07-25T06:17:52.537Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:52.933Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:52.984Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:364#1d275f544daf228c
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.043Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map, through flatten WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.099Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.142Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.190Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.231Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.283Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.317Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.359Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.406Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.454Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.498Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.543Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.596Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.647Z: Unzipping flatten s13-u58 for input s14.org.apache.beam.sdk.values.PCollection.<init>:364#f0cbc4d341b04049-c56
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.695Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign, through flatten s13-u58, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.742Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.787Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.846Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.900Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.943Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:53.984Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.033Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.073Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.120Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.169Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.212Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.261Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial into WordCount.CountWords/Count.PerElement/Init/Map
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.307Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.359Z: Fusing consumer Window.Into()/Window.Assign into ParDo(AddTimestamp)
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.404Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.455Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.494Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.548Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.597Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.644Z: Fusing consumer MapElements/Map into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.691Z: Fusing consumer ParDo(AddTimestamp) into TextIO.Read/Read
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.740Z: Fusing consumer WordCount.CountWords/ParDo(ExtractWords) into Window.Into()/Window.Assign
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.789Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.830Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into MapElements/Map
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.881Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.923Z: Fusing consumer WordCount.CountWords/Count.PerElement/Init/Map into WordCount.CountWords/ParDo(ExtractWords)
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:54.975Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:55.548Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:55.599Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:55.658Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:55.677Z: Starting 1 workers in us-central1-b...
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:55.707Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:55.748Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
    Jul 25, 2018 6:18:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:17:56.135Z: Executing operation TextIO.Read/Read+ParDo(AddTimestamp)+Window.Into()/Window.Assign+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
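
The step names in these Dataflow fusion and execution messages come from the windowed WordCount example job that this suite submits. Reconstructed from those step names, the pipeline has roughly the shape sketched below; the input/output paths, window size, and timestamp logic are placeholders, and the real example source differs in detail.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptors;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    public class WindowedWordCountSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply("TextIO.Read", TextIO.read().from("gs://example-bucket/input.txt"))  // placeholder path
            .apply("ParDo(AddTimestamp)", ParDo.of(new DoFn<String, String>() {
              @ProcessElement
              public void process(ProcessContext c) {
                // Placeholder: stamp each line so it can be assigned to an event-time window.
                c.outputWithTimestamp(c.element(), Instant.now());
              }
            }))
            .apply("Window.Into()", Window.into(FixedWindows.of(Duration.standardMinutes(1))))
            .apply("ExtractWords", ParDo.of(new DoFn<String, String>() {
              @ProcessElement
              public void process(ProcessContext c) {
                for (String word : c.element().split("[^\\p{L}]+")) {
                  if (!word.isEmpty()) {
                    c.output(word);
                  }
                }
              }
            }))
            .apply(Count.perElement())
            .apply("MapElements/Map",
                MapElements.into(TypeDescriptors.strings())
                    .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
            .apply("WriteOneFilePerWindow",
                TextIO.write()
                    .to("gs://example-bucket/output")  // placeholder path
                    .withWindowedWrites()
                    .withNumShards(1));
        p.run();
      }
    }
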
    Jul 25, 2018 6:18:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:18:07.537Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 25, 2018 6:18:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:18:18.041Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 25, 2018 6:18:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:18:18.069Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 25, 2018 6:18:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:18:34.654Z: Workers have started successfully.
    Jul 25, 2018 6:18:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:18:53.057Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
    Jul 25, 2018 6:18:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:18:53.156Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
    Jul 25, 2018 6:19:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:05.353Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
    Jul 25, 2018 6:19:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:05.440Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
    Jul 25, 2018 6:19:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:07.589Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
    Jul 25, 2018 6:19:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:07.690Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 25, 2018 6:19:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:13.149Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 25, 2018 6:19:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:13.239Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 25, 2018 6:19:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:22.733Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 25, 2018 6:19:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:22.830Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 25, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:28.185Z: Cleaning up.
    Jul 25, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:19:28.294Z: Stopping worker pool...
    Jul 25, 2018 6:21:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:21:38.370Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 25, 2018 6:21:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T06:21:38.412Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 25, 2018 6:21:47 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_23_17_44-1656338309911880065 finished with status DONE.
    Jul 25, 2018 6:21:47 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-24_23_17_44-1656338309911880065. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 25, 2018 6:21:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_23_17_44-1656338309911880065 finished with status DONE.
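
The operation names in the log above (TextIO.Read, ParDo(AddTimestamp), Window.Assign,
WordCount.CountWords/Count.PerElement, WriteOneFilePerWindow/TextIO.Write) come from a
windowed word-count pipeline. Below is a minimal sketch of such a pipeline with the Beam
Java SDK, not the precommit example itself: the input path, window size and output prefix
are placeholders, and TextIO's windowed writes stand in for the example's
WriteOneFilePerWindow transform.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.WithTimestamps;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptors;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    public class MinimalWindowedWordCount {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("ReadLines", TextIO.read().from("/tmp/input.txt"))  // placeholder input
            // The precommit example spreads synthetic timestamps over a time range;
            // here each line simply gets the current time before being windowed.
            .apply("AddTimestamp", WithTimestamps.of((String line) -> Instant.now()))
            .apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(10))))
            .apply("ExtractWords",
                FlatMapElements.into(TypeDescriptors.strings())
                    .via((String line) -> Arrays.asList(line.split("[^\\p{L}]+"))))
            .apply(Filter.by((String word) -> !word.isEmpty()))
            .apply(Count.perElement())
            .apply("FormatResults",
                MapElements.into(TypeDescriptors.strings())
                    .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
            // Windowed collections need windowed writes with an explicit shard count;
            // the example wraps this in WriteOneFilePerWindow for per-window file names.
            .apply(TextIO.write().to("/tmp/wordcounts").withWindowedWrites().withNumShards(1));

        p.run().waitUntilFinish();
      }
    }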

Gradle Test Executor 97 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java-examples:preCommit
Finished generating test XML results (0.004 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/test-results/preCommit>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/reports/tests/preCommit>
Packing task ':beam-runners-google-cloud-dataflow-java-examples:preCommit'
:beam-runners-google-cloud-dataflow-java-examples:preCommit (Thread[Task worker for ':' Thread 6,5,main]) completed. Took 12 mins 57.443 secs.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 6,5,main]) started.

> Task :beam-examples-java:preCommit
Skipping task ':beam-examples-java:preCommit' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':' Thread 2,5,main]) started.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java-examples:test NO-SOURCE
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:test' as it has no source files and no previous output files.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.001 secs.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':' Thread 2,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:check
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:check' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':' Thread 2,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:build
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:build' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:beam-examples-java:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) started.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':',5,main]) started.

> Task :beam-examples-java:buildDependents
Caching disabled for task ':beam-examples-java:buildDependents': Caching has not been enabled for the task
Task ':beam-examples-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-examples-java:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-java-io-google-cloud-platform:buildDependents
Caching disabled for task ':beam-sdks-java-io-google-cloud-platform:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-google-cloud-platform:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-protobuf:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 21m 12s
648 actionable tasks: 643 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/hlci3qjdvngpm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #143

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/143/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4845] Make BigQueryServices and FakeDatasetService public to fix

[amyrvold] Fix [BEAM-4847] by manually setting max workers and jvm memory and add

[relax] Enable schemas for Apex runner.

[relax] Enable Schemas on samza runner.

[relax] Enable schemas for Flink runner.

[relax] Enable schemas for Spark.

[relax] Fix bugs.

[relax] Fix Apex breakage.

[ehudm] Remove CODEOWNERs.

------------------------------------------
[...truncated 16.25 MB...]
    INFO: 2018-07-25T00:19:17.462Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.503Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.551Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.589Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.641Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.688Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.734Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.780Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.817Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.862Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.913Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.957Z: Unzipping flatten s13-u58 for input s14.org.apache.beam.sdk.values.PCollection.<init>:364#f0cbc4d341b04049-c56
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:17.994Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign, through flatten s13-u58, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.041Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.088Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.130Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.179Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.224Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.275Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.330Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.369Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.423Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.466Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.512Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.545Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial into WordCount.CountWords/Count.PerElement/Init/Map
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.587Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.638Z: Fusing consumer Window.Into()/Window.Assign into ParDo(AddTimestamp)
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.686Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.729Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.776Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.826Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.879Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.926Z: Fusing consumer MapElements/Map into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:18.966Z: Fusing consumer ParDo(AddTimestamp) into TextIO.Read/Read
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.011Z: Fusing consumer WordCount.CountWords/ParDo(ExtractWords) into Window.Into()/Window.Assign
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.057Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.106Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into MapElements/Map
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.158Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.202Z: Fusing consumer WordCount.CountWords/Count.PerElement/Init/Map into WordCount.CountWords/ParDo(ExtractWords)
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.251Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.854Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.900Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.950Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 25, 2018 12:19:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.965Z: Starting 1 workers in us-central1-f...
    Jul 25, 2018 12:19:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:19.991Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 25, 2018 12:19:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:20.045Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
    Jul 25, 2018 12:19:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:20.392Z: Executing operation TextIO.Read/Read+ParDo(AddTimestamp)+Window.Into()/Window.Assign+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
    Jul 25, 2018 12:19:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:27.125Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 25, 2018 12:19:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:37.557Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 25, 2018 12:19:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:37.606Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 25, 2018 12:19:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:19:52.748Z: Workers have started successfully.
    Jul 25, 2018 12:20:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:11.511Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
    Jul 25, 2018 12:20:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:11.628Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
    Jul 25, 2018 12:20:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:23.834Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
    Jul 25, 2018 12:20:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:23.944Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
    Jul 25, 2018 12:20:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:25.010Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
    Jul 25, 2018 12:20:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:25.109Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 25, 2018 12:20:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:31.484Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 25, 2018 12:20:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:31.585Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 25, 2018 12:20:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:38.972Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 25, 2018 12:20:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:39.123Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 25, 2018 12:20:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:43.521Z: Cleaning up.
    Jul 25, 2018 12:20:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:20:43.674Z: Stopping worker pool...
    Jul 25, 2018 12:22:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:22:18.657Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 25, 2018 12:22:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T00:22:18.704Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 25, 2018 12:22:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_17_19_08-13209771397614533617 finished with status DONE.
    Jul 25, 2018 12:22:28 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-24_17_19_08-13209771397614533617. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 25, 2018 12:22:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_17_19_08-13209771397614533617 finished with status DONE.
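
The "Found 0 success, 0 failures out of 0 expected assertions" line refers to PAssert
checks: TestDataflowRunner counts the PAssert transforms attached to the pipeline and
verifies that the finished job reported a matching number of successes. A minimal sketch
of how such assertions are typically written (assuming JUnit 4 and the Beam testing
utilities; this is not the actual precommit test) looks like:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.junit.Rule;
    import org.junit.Test;

    public class WordCountAssertionTest {

      // TestPipeline picks its runner (DirectRunner, TestDataflowRunner, ...) from the
      // pipeline options supplied to the test JVM.
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void countsElements() {
        // Each PAssert registers one "expected assertion"; the runner later checks that
        // the job reported a success for every registered assertion.
        PAssert.thatSingleton(p.apply(Create.of("a", "b", "a")).apply(Count.globally()))
            .isEqualTo(3L);

        p.run().waitUntilFinish();
      }
    }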

Gradle Test Executor 97 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java-examples:preCommit
Finished generating test XML results (0.004 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/test-results/preCommit>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/reports/tests/preCommit>
Packing task ':beam-runners-google-cloud-dataflow-java-examples:preCommit'
:beam-runners-google-cloud-dataflow-java-examples:preCommit (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 13 mins 51.961 secs.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 11,5,main]) started.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':' Thread 8,5,main]) started.

> Task :beam-examples-java:preCommit
Skipping task ':beam-examples-java:preCommit' as it has no actions.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java-examples:test NO-SOURCE
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:test' as it has no source files and no previous output files.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':' Thread 8,5,main]) completed. Took 0.002 secs.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':' Thread 8,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:check
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:check' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':' Thread 8,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:build
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:build' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.001 secs.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-examples-java:buildDependents
Caching disabled for task ':beam-examples-java:buildDependents': Caching has not been enabled for the task
Task ':beam-examples-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-io-google-cloud-platform:buildDependents
Caching disabled for task ':beam-sdks-java-io-google-cloud-platform:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-google-cloud-platform:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-protobuf:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-kafka:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/io/kafka/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
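
The third failure is in the unit tests of the beam-sdks-java-io-kafka module (the report
link above points at the Gradle HTML test report for that task). For context, a minimal
sketch of the KafkaIO read API that module provides is shown below; the broker address,
topic name and record bound are placeholders, not values taken from the failing tests.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Values;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("ReadFromKafka",
                KafkaIO.<Long, String>read()
                    .withBootstrapServers("localhost:9092")        // placeholder broker
                    .withTopic("beam-test-topic")                  // placeholder topic
                    .withKeyDeserializer(LongDeserializer.class)
                    .withValueDeserializer(StringDeserializer.class)
                    .withMaxNumRecords(10)        // bound the read so the pipeline finishes
                    .withoutMetadata())           // emit KV<Long, String> pairs
            .apply(Values.create());

        p.run().waitUntilFinish();
      }
    }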

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 19s
647 actionable tasks: 642 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/ef7h3mka5i23s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/142/display/redirect?page=changes>

Changes:

[github] Fixing log message

------------------------------------------
[...truncated 16.45 MB...]
    INFO: 2018-07-24T18:19:05.731Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.761Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.789Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.823Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.846Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.870Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.891Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.918Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.943Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:05.974Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.002Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.025Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.049Z: Unzipping flatten s13-u58 for input s14.org.apache.beam.sdk.values.PCollection.<init>:364#f0cbc4d341b04049-c56
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.083Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign, through flatten s13-u58, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.115Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.140Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.162Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.188Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.212Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.243Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.266Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.287Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.312Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.337Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.365Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.395Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial into WordCount.CountWords/Count.PerElement/Init/Map
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.432Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.451Z: Fusing consumer Window.Into()/Window.Assign into ParDo(AddTimestamp)
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.476Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.500Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.525Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.559Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.591Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.622Z: Fusing consumer MapElements/Map into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.643Z: Fusing consumer ParDo(AddTimestamp) into TextIO.Read/Read
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.670Z: Fusing consumer WordCount.CountWords/ParDo(ExtractWords) into Window.Into()/Window.Assign
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.691Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.717Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into MapElements/Map
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.745Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.775Z: Fusing consumer WordCount.CountWords/Count.PerElement/Init/Map into WordCount.CountWords/ParDo(ExtractWords)
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:06.805Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.147Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.178Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.205Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.213Z: Starting 1 workers in us-central1-b...
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.229Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.256Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
    Jul 24, 2018 6:19:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:07.534Z: Executing operation TextIO.Read/Read+ParDo(AddTimestamp)+Window.Into()/Window.Assign+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
    Jul 24, 2018 6:19:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:18.600Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 24, 2018 6:19:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:29.162Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 24, 2018 6:19:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:19:29.209Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 24, 2018 6:20:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:01.122Z: Workers have started successfully.
    Jul 24, 2018 6:20:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:38.780Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
    Jul 24, 2018 6:20:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:38.867Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
    Jul 24, 2018 6:20:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:51.203Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
    Jul 24, 2018 6:20:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:51.282Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
    Jul 24, 2018 6:20:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:53.403Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
    Jul 24, 2018 6:20:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:53.486Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 24, 2018 6:21:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:20:59.957Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 24, 2018 6:21:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:21:00.060Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 24, 2018 6:21:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:21:09.418Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 24, 2018 6:21:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:21:09.512Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 24, 2018 6:21:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:21:14.996Z: Cleaning up.
    Jul 24, 2018 6:21:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:21:15.126Z: Stopping worker pool...
    Jul 24, 2018 6:22:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:22:49.011Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 24, 2018 6:22:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:22:49.053Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 24, 2018 6:22:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:22:49.119Z: Worker pool stopped.
    Jul 24, 2018 6:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_11_18_57-15976176805706694236 finished with status DONE.
    Jul 24, 2018 6:22:58 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-24_11_18_57-15976176805706694236. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 24, 2018 6:23:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_11_18_57-15976176805706694236 finished with status DONE.
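
For context, the step names in the fusion and execution log above (TextIO.Read/Read, ParDo(AddTimestamp), Window.Into()/Window.Assign, WordCount.CountWords, MapElements/Map, WriteOneFilePerWindow/TextIO.Write) come from a windowed word-count example pipeline. The sketch below shows roughly that pipeline shape; it is illustrative only and not the exact example code that ran: the input path, output location, window size, timestamp function, and class name are all assumptions.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.WithTimestamps;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptors;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    public class WindowedWordCountSketch {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline p = Pipeline.create(options);

        // Illustrative input path, not the path the precommit job used.
        p.apply("ReadLines", TextIO.read().from("gs://example-bucket/input.txt"))
            // Assign event timestamps; the real example derives them from the data.
            .apply("AddTimestamp", WithTimestamps.of((String line) -> Instant.now()))
            // Fixed ten-minute windows (the window size here is an assumption).
            .apply("Window", Window.<String>into(FixedWindows.of(Duration.standardMinutes(10))))
            // Split each line into words.
            .apply("ExtractWords", FlatMapElements.into(TypeDescriptors.strings())
                .via((String line) -> Arrays.asList(line.split("[^\\p{L}]+"))))
            // Count occurrences of each word per window.
            .apply("CountWords", Count.perElement())
            // Format each word/count pair as a line of text.
            .apply("FormatResults", MapElements.into(TypeDescriptors.strings())
                .via((KV<String, Long> wordCount) ->
                    wordCount.getKey() + ": " + wordCount.getValue()))
            // Windowed writes emit one set of output shards per window.
            .apply("WriteOneFilePerWindow", TextIO.write()
                .to("gs://example-bucket/counts")
                .withWindowedWrites()
                .withNumShards(1));

        p.run().waitUntilFinish();
      }
    }

The TextIO.write() step expands into the WriteFiles sub-transforms (WriteUnshardedBundlesToTempFiles, GatherTempFileResults, FinalizeTempFileBundles) that account for most of the fusion messages above.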

Gradle Test Executor 97 finished executing tests.
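
The "Found 0 success, 0 failures out of 0 expected assertions" line above simply means the example pipeline registers no PAssert checks, so TestDataflowRunner only verifies that the job finished. For comparison, a hypothetical test that does register assertions might look like the sketch below; the class name, input strings, and expected counts are made up for illustration.

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    /** Hypothetical test, not part of the Beam codebase. */
    public class WordCountAssertionTest {

      @Rule public final transient TestPipeline pipeline = TestPipeline.create();

      @Test
      public void wordCountsAreAsserted() {
        PCollection<KV<String, Long>> counts =
            pipeline
                .apply(Create.of("to be", "or not", "to be"))
                .apply(Count.perElement());

        // Each PAssert call registers one "expected assertion" that the
        // test runner verifies after the pipeline finishes.
        PAssert.that(counts)
            .containsInAnyOrder(KV.of("to be", 2L), KV.of("or not", 1L));

        pipeline.run().waitUntilFinish();
      }
    }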

> Task :beam-runners-google-cloud-dataflow-java-examples:preCommit
Finished generating test XML results (0.004 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/test-results/preCommit>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/reports/tests/preCommit>
Packing task ':beam-runners-google-cloud-dataflow-java-examples:preCommit'
:beam-runners-google-cloud-dataflow-java-examples:preCommit (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 14 mins 11.684 secs.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 11,5,main]) started.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':' Thread 5,5,main]) started.

> Task :beam-examples-java:preCommit
Skipping task ':beam-examples-java:preCommit' as it has no actions.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java-examples:test NO-SOURCE
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:test' as it has no source files and no previous output files.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.001 secs.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':' Thread 5,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:check
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:check' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':' Thread 5,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:build
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:build' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':' Thread 5,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:beam-examples-java:buildDependents (Thread[Task worker for ':' Thread 5,5,main]) started.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 7,5,main]) started.

> Task :beam-examples-java:buildDependents
Caching disabled for task ':beam-examples-java:buildDependents': Caching has not been enabled for the task
Task ':beam-examples-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-examples-java:buildDependents (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-2:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 32s
632 actionable tasks: 627 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/bqz3fbps6qjee

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure