Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/13 16:32:25 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #1217

See <https://builds.apache.org/job/beam_PostCommit_Python2/1217/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8947] Change Flink docker image names in Jenkins jobs to match


------------------------------------------
[...truncated 1.42 MB...]
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-b85964c6-736b-4fd9-807c-f49612f7d37e
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-ec1d788f-f1f2-41c2-ba9d-24badb299514
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:42707
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 65606 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=765, count=36, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=668, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=636, count=33, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=773, count=33, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=612, count=37, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=724, count=20, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=652, count=35, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempgsu5jM/artifactsXM6BZF/job_5d8b72f8-68c1-4d11-b138-5c733c2d56d6/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempgsu5jM/artifactsXM6BZF/job_5d8b72f8-68c1-4d11-b138-5c733c2d56d6/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1213154500-a89a7257_44b7d6f3-bafe-4bd1-b5c0-a0c473924441
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1213154500-a89a7257_44b7d6f3-bafe-4bd1-b5c0-a0c473924441
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/13 15:46:14 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:46585
19/12/13 15:46:14 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:45649
19/12/13 15:46:14 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:53765
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/13 15:46:15 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c
19/12/13 15:46:16 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/13 15:46:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/13 15:46:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/13 15:46:16 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/13 15:46:16 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/13 15:46:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c on Spark master local[4]
19/12/13 15:46:17 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/13 15:46:17 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40855.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/13 15:46:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43423.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:45307
19/12/13 15:46:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/13 15:46:20 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/13 15:46:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/13 15:47:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c finished.
19/12/13 15:47:25 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/13 15:47:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempSO7w9e/artifactsiEoFH9/job_564f8e2d-19e9-4dfb-9fec-1f7e9b33f79f/MANIFEST has 1 artifact locations
19/12/13 15:47:25 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempSO7w9e/artifactsiEoFH9/job_564f8e2d-19e9-4dfb-9fec-1f7e9b33f79f/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/13 15:47:25 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c
19/12/13 15:47:25 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1213154615-95ce1aad_1609d829-44bd-424f-baba-5152b1f5d91c
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576252046.333130820","description":"Error received from peer ipv4:127.0.0.1:43423","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576252046.333166865","description":"Error received from peer ipv4:127.0.0.1:40855","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576252046.333092277","description":"Error received from peer ipv4:127.0.0.1:45307","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576252046.333092277","description":"Error received from peer ipv4:127.0.0.1:45307","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
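Note on the BigQuerySink deprecation above: big_query_query_to_table_pipeline.py still writes through the legacy BigQuerySink, which has been deprecated since 2.11.0 in favour of WriteToBigQuery. A minimal sketch of the suggested replacement (table name, schema, and data are placeholders; a kms_key can be passed the same way as with the deprecated sink):

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import BigQueryDisposition, WriteToBigQuery

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'word': 'beam', 'count': 1}])
             | WriteToBigQuery(
                 'my-project:my_dataset.my_table',          # hypothetical table
                 schema='word:STRING,count:INTEGER',
                 create_disposition=BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=BigQueryDisposition.WRITE_APPEND))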
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3356.653s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 12s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/wdqbbzolhh3v2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #1247

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1247/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python2 #1246

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1246/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-1440] Provide functions for waiting for BQ job and exporting

[kamil.wasilewski] [BEAM-1440] Create _BigQuerySource that implements iobase.BoundedSource

[kamil.wasilewski] [BEAM-1440] Reorganised BigQuery read IT tests

[kamil.wasilewski] [BEAM-1440] Create postCommitIT jobs running on Flink Runner

[kamil.wasilewski] [BEAM-1440] Convert strings to bytes on Python 3 if field type is BYTES

[kamil.wasilewski] [BEAM-1440]: Support RECORD fields in coder

[kamil.wasilewski] [BEAM-1440] Remove json files after reading

[kamil.wasilewski] [BEAM-1440] Marked classes as private

[kamil.wasilewski] [BEAM-1440] Do not force to create temp dataset when using dry run


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37063
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 35626 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_15}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=708, count=33, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, 
count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=685, count=34, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=677, count=35, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=924, count=39, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=589, count=35, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=757, count=20, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=656, count=36, min=16, max=22}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
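Note on the MetricQueryResults dump above: it is the Flink runner's rendering of every counter and distribution accumulated for the word-count job, including the user metrics from __main__.WordExtractingDoFn (words, empty_lines, word_len_dist). The same values can be read programmatically from the PipelineResult; a minimal sketch, assuming `result` is the object returned by p.run() for this job:

    from apache_beam.metrics.metric import MetricsFilter

    # Query only the user metrics defined in WordExtractingDoFn.
    metrics_filter = MetricsFilter().with_namespace('__main__.WordExtractingDoFn')
    query = result.metrics().query(metrics_filter)
    for counter in query['counters']:          # e.g. words, empty_lines
        print('%s = %s' % (counter.key.metric.name, counter.committed))
    for dist in query['distributions']:        # e.g. word_len_dist
        print('%s mean = %s' % (dist.key.metric.name, dist.committed.mean))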
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp_NQS9t/artifacts2a1YlZ/job_701cbb1a-502f-45d5-9f52-2948355553ef/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp_NQS9t/artifacts2a1YlZ/job_701cbb1a-502f-45d5-9f52-2948355553ef/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1217215615-c855d03a_a7b35188-3cfe-44e5-af99-ae0d12ddd98e
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1217215615-c855d03a_a7b35188-3cfe-44e5-af99-ae0d12ddd98e
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/17 21:56:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:36373
19/12/17 21:56:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43241
19/12/17 21:56:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:48817
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/17 21:57:01 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9
19/12/17 21:57:01 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
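Note on the LOOPBACK line above: with environment_type=LOOPBACK the SDK harness runs inside the submitting Python process, which is why the client waits for the job instead of returning immediately. A minimal sketch of submitting a pipeline to the job server this log started on localhost:48817 (the port comes from the log; the pipeline itself is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:48817',    # JobService port from the log above
        '--environment_type=LOOPBACK',       # run the SDK harness in this process
    ])
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['to', 'be']) | beam.Map(lambda w: (w, 1))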
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 21:57:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 21:57:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 21:57:02 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 21:57:03 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 21:57:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9 on Spark master local[4]
19/12/17 21:57:03 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 21:57:03 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37021.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 21:57:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:44741.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36931
19/12/17 21:57:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 21:57:06 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 21:57:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 21:57:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9 finished.
19/12/17 21:57:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 21:57:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp3OnyDO/artifactsQ4XMsg/job_f3b42ca9-21f6-46dc-bd65-9c9372d8c55a/MANIFEST has 1 artifact locations
19/12/17 21:57:42 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp3OnyDO/artifactsQ4XMsg/job_f3b42ca9-21f6-46dc-bd65-9c9372d8c55a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 21:57:42 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9
19/12/17 21:57:42 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217215701-b71b9c2e_5d62ea2e-327e-4f12-9449-67f1ca6687c9
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576619863.032590509","description":"Error received from peer ipv4:127.0.0.1:44741","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576619863.032566586","description":"Error received from peer ipv4:127.0.0.1:36931","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576619863.032566586","description":"Error received from peer ipv4:127.0.0.1:36931","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576619863.032592725","description":"Error received from peer ipv4:127.0.0.1:37021","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
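Note on the three _Rendezvous tracebacks above: they come from the SDK worker's long-lived control, state, and data streams; the job has already reached DONE when the job server closes those sockets, so the streaming iterators raise StatusCode.UNAVAILABLE / "Socket closed" while draining. A minimal sketch of how such a streaming-RPC iterator failure can be tolerated on shutdown (the handler is hypothetical, not Beam's actual worker code):

    import grpc

    def drain(responses, handle):
        # 'responses' stands for a streaming-RPC iterator like the control,
        # state, or data streams in the tracebacks above; 'handle' is a
        # hypothetical per-message callback.
        try:
            for response in responses:
                handle(response)
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                pass  # server closed the socket, e.g. job-server shutdown after DONE
            else:
                raise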


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:769: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:261: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1593: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1593: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:755: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1409: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1409: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:769: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1406: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 47 tests in 3387.928s

OK (SKIP=4)
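
The BeamDeprecationWarning lines in the run above flag BigQuerySink as deprecated since 2.11.0 in favour of WriteToBigQuery. A minimal, hedged sketch of the replacement the warning points to (the table spec and schema below are placeholders, not values from this build):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.word_counts',  # assumed table spec
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))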

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 82

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
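
For local triage of the failing task, re-running it from the Beam repository root with the flags Gradle suggests above is usually the quickest way to surface the underlying 'sh' error (this assumes a checkout with whatever environment the hdfs integration test expects, e.g. Docker and a Python 2 toolchain):

    ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --info --stacktrace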

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 49s
120 actionable tasks: 97 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/yhr6nemnp6skk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1245

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1245/display/redirect?page=changes>

Changes:

[relax] Merge pull request #10311: [BEAM-8810] Detect stuck commits in


------------------------------------------
[...truncated 1.49 MB...]
19/12/17 20:32:37 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219
19/12/17 20:32:37 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 20:32:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 20:32:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 20:32:38 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 20:32:39 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 20:32:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219 on Spark master local[4]
19/12/17 20:32:39 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 20:32:39 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:32845.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 20:32:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33931.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:41763
19/12/17 20:32:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 20:32:42 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 20:33:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 20:33:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219 finished.
19/12/17 20:33:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 20:33:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempURi4tx/artifactsFv2kZ6/job_5cb7b835-aa4e-4a2a-824b-0a028475d7c1/MANIFEST has 1 artifact locations
19/12/17 20:33:43 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempURi4tx/artifactsFv2kZ6/job_5cb7b835-aa4e-4a2a-824b-0a028475d7c1/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 20:33:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219
19/12/17 20:33:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217203237-c4cbb271_7eaa3f23-2149-4850-8c7d-c7bdcfd94219
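The run above goes through the portable Spark runner with the LOOPBACK environment, i.e. the submitting process itself hosts the SDK harness, which is why the client waits for the pipeline to finish. A rough sketch of pipeline options that produce this mode (the job endpoint address and output path are assumptions, not taken from this log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',   # assumed job server address
        '--environment_type=LOOPBACK',
    ])
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(lambda line: line.split())
         | beam.combiners.Count.PerElement()
         | beam.Map(lambda kv: '%s: %d' % kv)
         | beam.io.WriteToText('/tmp/counts'))   # placeholder output path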
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576614824.280329315","description":"Error received from peer ipv4:127.0.0.1:33931","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576614824.280883746","description":"Error received from peer ipv4:127.0.0.1:32845","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576614824.280330119","description":"Error received from peer ipv4:127.0.0.1:41763","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576614824.280330119","description":"Error received from peer ipv4:127.0.0.1:41763","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
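
The _Rendezvous tracebacks above (StatusCode.UNAVAILABLE, "Socket closed") are raised only after the job has already reached DONE: the control, state and data channels are torn down while the worker threads are still blocked on their streaming reads, so they read as shutdown noise rather than the cause of this build's failure, which Gradle attributes to hdfsIntegrationTest further down. A hedged sketch (not Beam's actual worker code) of a reader loop that treats that status as normal teardown:

    import grpc

    def drain(response_iterator, handle):
        # 'handle' is a hypothetical per-response callback.
        try:
            for response in response_iterator:
                handle(response)
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                # Channel closed underneath us after the job finished
                # ("Socket closed"); treat as normal shutdown.
                return
            raise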


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3319.504s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_21_57-18097809230977229113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_29_20-18235151910325636623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_37_05-669192379095939170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_43_37-14731641073491818512?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_49_45-3143446595846296752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_56_29-16732987321626238176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_13_03_33-16016390684718383099?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_13_09_52-5136773241108623584?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_21_59-4486105337145652476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_36_18-4779355395020712325?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_44_03-12784615440698309805?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_51_31-5666288062418709563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_58_27-1811685044955375706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_22_01-718627198555977489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_34_14-17339624575759914198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_40_14-7045374201631295623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_46_27-1092005231830528042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_52_33-7499294439442284946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_21_56-7646675489892582927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_38_52-1749834385877085626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_55_58-8595794603260757738?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_21_55-17456748637205018407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_38_43-4623077574425911609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_45_35-7060511507969631960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_52_53-1515327047896418909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_22_01-1405493834964191387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_28_18-952952631143277930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_36_53-13343467883653141276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_43_18-9780235665161289421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_49_38-14784506671743611260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_21_57-6055359361951331339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_29_13-4433071081007834004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_36_45-14312834031844844530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_44_34-10659622174174760365?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_51_19-14577601055582906103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_21_57-7326085078176373951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_30_31-5095712196083720131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_40_08-13786163082270310251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_46_50-2357018402667898215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_12_52_46-16996801468464091260?project=apache-beam-testing
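
The recurring "options is deprecated since First stable release" warnings in the run above come from code that reads <pipeline>.options, a property the SDK has marked for removal. One way to avoid the warning, sketched here under the assumption that the options object is still in scope (the temp_location value is a placeholder), is to keep the PipelineOptions used to build the pipeline and query it directly:

    from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                      PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])
    temp_location = options.view_as(GoogleCloudOptions).temp_location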

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 57s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/s4yduoepp3fk4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1244

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1244/display/redirect?page=changes>

Changes:

[boyuanz] [BEAM-8536] Migrate using requested_execution_time to

[daniel.o.programmer] [BEAM-7970] Touch-up on Go protobuf generation instructions.


------------------------------------------
[...truncated 1.41 MB...]
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-19c465ea-d741-4d3e-8f3f-7bb9ae59f42f
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-607f3a97-941b-4470-b4b4-7fae97084d51
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-1ce55d6c-26a7-4e89-a49a-5eec66e9c1b9
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:45941
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 65574 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=859, count=40, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=626, count=31, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=701, count=36, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=827, count=35, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=627, count=38, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=775, count=19, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=556, count=30, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempaIbpsc/artifacts1yqAz0/job_ea499abe-3bee-4fbc-91e3-d7c1cf48c1f3/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempaIbpsc/artifacts1yqAz0/job_ea499abe-3bee-4fbc-91e3-d7c1cf48c1f3/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1217185204-9107c74c_092fe373-c318-47e8-9076-725809674794
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1217185204-9107c74c_092fe373-c318-47e8-9076-725809674794
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/17 18:53:18 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45353
19/12/17 18:53:18 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:41315
19/12/17 18:53:18 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:35439
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/17 18:53:20 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea
19/12/17 18:53:20 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 18:53:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 18:53:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 18:53:21 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 18:53:21 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 18:53:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea on Spark master local[4]
19/12/17 18:53:22 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 18:53:22 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:44333.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 18:53:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:41385.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:43017
19/12/17 18:53:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 18:53:25 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
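
The two BeamDeprecationWarning entries above are emitted because the I/O code reads options back off the pipeline object via p.options. A minimal sketch of the pattern the warning points toward, keeping a reference to the PipelineOptions used to construct the pipeline (the flag value and elements below are hypothetical, not taken from this run):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (DebugOptions,
                                                      GoogleCloudOptions,
                                                      PipelineOptions)

    # Hypothetical flags, for illustration only.
    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])

    # Query option views on the options object itself instead of <pipeline>.options.
    experiments = options.view_as(DebugOptions).experiments or []
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'Create' >> beam.Create(['a', 'b'])
             | 'Upper' >> beam.Map(lambda s: s.upper()))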

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 18:54:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea: Pipeline translated successfully. Computing outputs

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 18:54:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea finished.
19/12/17 18:54:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 18:54:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temptN5GrR/artifactslHKGhx/job_9b6ba2c4-9225-4abb-8a43-a49718881b17/MANIFEST has 1 artifact locations
19/12/17 18:54:33 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temptN5GrR/artifactslHKGhx/job_9b6ba2c4-9225-4abb-8a43-a49718881b17/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 18:54:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea
19/12/17 18:54:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217185320-401b9eec_1183cd39-3dea-4a60-8722-0570435f0cea
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576608873.665302585","description":"Error received from peer ipv4:127.0.0.1:44333","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576608873.664917980","description":"Error received from peer ipv4:127.0.0.1:41385","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576608873.664833163","description":"Error received from peer ipv4:127.0.0.1:43017","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576608873.664833163","description":"Error received from peer ipv4:127.0.0.1:43017","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
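
The three tracebacks above all end in _Rendezvous errors with StatusCode.UNAVAILABLE ("Socket closed"): the SDK worker's control, state and data channels appear to be torn down by the job server after the job has already reached DONE, so the streaming RPCs fail while the worker threads are still draining them. A minimal, hypothetical sketch (not Beam's actual handling) of tolerating that shutdown race when iterating a gRPC response stream:

    import grpc

    def drain(response_stream):
        # Yield responses until the stream ends; treat UNAVAILABLE raised
        # during server shutdown as an expected end-of-stream and re-raise
        # anything else.
        try:
            for response in response_stream:
                yield response
        except grpc.RpcError as err:
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise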


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
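
The dataflow_runner.py warning above notes that BigQuerySink is deprecated since 2.11.0 in favor of WriteToBigQuery. A minimal sketch of the replacement transform (the table name, schema and rows below are hypothetical, not taken from this run):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | 'MakeRows' >> beam.Create([{'word': 'beam', 'count': 1}])
        _ = rows | 'WriteToBQ' >> beam.io.WriteToBigQuery(
            table='my-project:my_dataset.word_counts',
            schema='word:STRING, count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
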
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
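
The fileio_test.py FutureWarnings above come from the experimental MatchAll and ReadMatches transforms exercised by these integration tests. A standalone sketch of that usage (the glob pattern is hypothetical, not taken from this run):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | 'Patterns' >> beam.Create(['/tmp/example-*.txt'])
             | 'Match' >> fileio.MatchAll()
             | 'Read' >> fileio.ReadMatches()
             | 'GetPath' >> beam.Map(lambda readable_file: readable_file.metadata.path))
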
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3180.398s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 5s
120 actionable tasks: 97 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/kz6h4ziyl3sha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1243

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1243/display/redirect>

Changes:


------------------------------------------
[...truncated 1.48 MB...]
19/12/17 12:10:06 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7
19/12/17 12:10:06 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 12:10:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 12:10:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 12:10:07 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 12:10:07 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 12:10:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7 on Spark master local[4]
19/12/17 12:10:08 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 12:10:08 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:43739.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 12:10:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33449.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:35553
19/12/17 12:10:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 12:10:11 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 12:10:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 12:10:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7 finished.
19/12/17 12:10:28 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 12:10:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempYJlSyc/artifactsSspYtC/job_1c89a134-da8c-40bf-82c8-33c33b0f1434/MANIFEST has 1 artifact locations
19/12/17 12:10:28 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempYJlSyc/artifactsSspYtC/job_1c89a134-da8c-40bf-82c8-33c33b0f1434/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 12:10:28 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7
19/12/17 12:10:28 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217121006-ddd508fe_a9a467df-f8d7-4781-83e3-c8c98a4c53d7
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576584628.421174107","description":"Error received from peer ipv4:127.0.0.1:43739","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576584628.421144680","description":"Error received from peer ipv4:127.0.0.1:33449","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576584628.421112549","description":"Error received from peer ipv4:127.0.0.1:35553","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576584628.421112549","description":"Error received from peer ipv4:127.0.0.1:35553","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3500.759s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_21-14598409556339219614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_10_11-4724174876529057084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_17_50-7644492503532938226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_24_32-7528920717080125825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_31_18-920729661060748599?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_38_16-8116784531841208574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_45_44-12618435694548068038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_52_56-5777418031517897712?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_23-8079637870068897271?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_16_29-15739076963441876689?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_25_03-16444555928718898201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_32_47-10623854429196315159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_39_21-17682703136489041511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_20-16517733973850186736?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_19_43-10354057229883469523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_37_08-7709668868563769319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_22-17255752508784406966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_14_23-15318943159458444537?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_21_11-5259541750105290383?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_27_18-12034733563264263331?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_33_26-8836419958904255803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_21-13868974379590193871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_18_42-6628364991890701462?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_26_03-14691404356591608848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_33_00-16450119963773065547?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_22-343804021566220889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_09_55-15075607852796674225?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_17_50-8140752726496782411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_24_02-3296914319307479858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_30_33-12196507852896461566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_21-13984044336682726800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_09_22-11619177622939576201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_17_54-246081192542139247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_26_08-14996437569171241480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_33_12-6712442396978585921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_02_21-2501214839503511647?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_10_53-2722058013647434266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_20_55-10091555267644476118?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_27_23-16589477015516804591?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-17_04_34_17-7459179672735910933?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 38s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/ymbazmm46yf4u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python2 #1242

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1242/display/redirect>

Changes:


------------------------------------------
[...truncated 1.44 MB...]
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_15}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=805, count=38, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, 
count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=666, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=683, count=35, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=811, count=35, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=599, count=36, min=14, max=20}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=796, count=22, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=653, count=35, min=16, max=25}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp6PU5zz/artifactsyp0Nks/job_32ba331f-a2ff-45ab-b254-4d87a4d22c9a/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp6PU5zz/artifactsyp0Nks/job_32ba331f-a2ff-45ab-b254-4d87a4d22c9a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1217061745-c231ca2d_b38aa998-8c17-4a7b-bc09-68c69f45fd4c
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1217061745-c231ca2d_b38aa998-8c17-4a7b-bc09-68c69f45fd4c
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
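
The two INFO:root lines above ("number of empty lines" and "average word length") are derived from the same user metrics that appear in the Flink runner's MetricQueryResults dump earlier in this task (NAME=empty_lines and NAME=word_len_dist under __main__.WordExtractingDoFn). A minimal sketch, assuming a wordcount-style pipeline like the one run here, of reading those values from the PipelineResult instead of scraping the runner's log line; the helper name is illustrative, not taken from the test code:

# Minimal sketch (illustrative helper, not the postcommit test code): query the
# user metrics shown in the MetricQueryResults dump from a finished pipeline.
from apache_beam.metrics.metric import MetricsFilter

def log_word_metrics(result):
    # "number of empty lines: 2" in the log corresponds to this counter.
    counters = result.metrics().query(
        MetricsFilter().with_name('empty_lines'))['counters']
    if counters:
        print('number of empty lines: %d' % counters[0].result)
    # "average word length: 3" corresponds to this distribution's mean.
    dists = result.metrics().query(
        MetricsFilter().with_name('word_len_dist'))['distributions']
    if dists:
        print('average word length: %d' % dists[0].result.mean)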

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/17 06:19:51 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:38465
19/12/17 06:19:51 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34257
19/12/17 06:19:51 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:58179
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/17 06:19:52 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35
19/12/17 06:19:52 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 06:19:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 06:19:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 06:19:54 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 06:19:54 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 06:19:55 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35 on Spark master local[4]
19/12/17 06:19:55 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 06:19:55 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:35375.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 06:19:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43025.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37257
19/12/17 06:19:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 06:19:58 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 06:20:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35: Pipeline translated successfully. Computing outputs
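
The portableWordCountSparkRunnerBatch task above submits wordcount to the portable Spark job server it just started (JobService on localhost:58179, LOOPBACK environment, per the log lines above). A minimal sketch of such a submission; the input and output paths are hypothetical placeholders, and the job endpoint port varies per run:

# Minimal sketch (hypothetical paths; endpoint/environment mirror the log above).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:58179',   # port printed by the job server above
    '--environment_type=LOOPBACK',
])

with beam.Pipeline(options=options) as p:
    (p
     | 'read' >> beam.io.ReadFromText('/tmp/input.txt')       # hypothetical path
     | 'split' >> beam.FlatMap(lambda line: line.split())
     | 'pair_with_one' >> beam.Map(lambda w: (w, 1))
     | 'group_and_sum' >> beam.CombinePerKey(sum)
     | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
     | 'write' >> beam.io.WriteToText('/tmp/counts'))         # hypothetical path

Runner-side flags such as the '--parallelism=2' seen in the "Discarding unparseable args" warning above are not recognized by the SDK's option parser, which is why that warning appears.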

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
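
The BeamDeprecationWarning above flags code that reaches into pcoll.pipeline.options inside a transform. A minimal sketch (the transform and option usage here are hypothetical, not Beam's bigquery.py) of the direction the warning points: accept PipelineOptions explicitly at construction time rather than reading them from the pipeline inside expand():

# Minimal sketch (hypothetical transform): options are handed in explicitly.
import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions

class TagWithProject(beam.PTransform):
    def __init__(self, options):
        super(TagWithProject, self).__init__()
        # Read the project from options passed to the transform, not from
        # pcoll.pipeline.options.
        self._project = options.view_as(GoogleCloudOptions).project

    def expand(self, pcoll):
        project = self._project
        return pcoll | 'AttachProject' >> beam.Map(lambda row: (project, row))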

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 06:22:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35 finished.
19/12/17 06:22:01 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 06:22:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temperyGPW/artifactsCX7Sba/job_869d01aa-0c63-4550-a741-04b281e63985/MANIFEST has 1 artifact locations
19/12/17 06:22:01 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temperyGPW/artifactsCX7Sba/job_869d01aa-0c63-4550-a741-04b281e63985/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 06:22:01 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35
19/12/17 06:22:01 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217061952-b903c0c6_a0af4a84-773c-408e-8730-65ec16426f35
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576563721.862827579","description":"Error received from peer ipv4:127.0.0.1:43025","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576563721.862719807","description":"Error received from peer ipv4:127.0.0.1:37257","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576563721.862719807","description":"Error received from peer ipv4:127.0.0.1:37257","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576563721.863471056","description":"Error received from peer ipv4:127.0.0.1:35375","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
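
All three _Rendezvous tracebacks above report StatusCode.UNAVAILABLE with "Socket closed" and appear after the job has already reached DONE: the LOOPBACK worker's reader threads are still blocked on the control, state, and data streams when the job server tears those gRPC channels down. A minimal sketch (a hypothetical helper, not the Beam SDK's actual handler) of how shutdown-time UNAVAILABLE errors on a gRPC stream can be told apart from genuine failures:

# Minimal sketch (hypothetical helper): drain a gRPC response stream and treat
# UNAVAILABLE raised at channel shutdown as benign, which is the situation
# behind the "Socket closed" tracebacks above.
import grpc

def drain_stream(responses, on_response):
    try:
        for response in responses:
            on_response(response)
    except grpc.RpcError as err:
        # UNAVAILABLE ("Socket closed") once the runner closes its channels
        # after the job finishes is expected; anything else is a real error.
        if err.code() != grpc.StatusCode.UNAVAILABLE:
            raise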


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3213.847s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_49-17053319517429881000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_11_04-18334126960473816169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_18_40-17378830036648073017?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_25_04-6116154305081789001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_30_44-8891670370209041353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_37_09-8308545990473704075?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_43_51-12682220865015565760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_50_20-13535358283392264735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_51-11301747978455217210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_17_33-14358167163127605925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_25_38-16126114555957696684?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_32_32-9446373213031171360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_48-7433491825847221415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_22_43-3210524380524707206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_28_53-16805885643869805571?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_35_25-1441526112803769064?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_50-13175607106710899159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_15_37-4771114502601329263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_21_54-8810055733277437905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_28_00-16004664209597779307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_34_06-16721136999798725652?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_48-1035162255388865055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_18_50-4562729417794332158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_25_58-9879724318379709850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_32_22-5303499743452434555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_38_41-7513803964015395093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_48-17183469092393314434?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_11_11-793863489714617377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_18_41-10776442241263393111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_25_07-9778792741312615771?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_31_28-2341720005947201727?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_47-4840183979263855403?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_10_45-9787962864386584841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_18_59-13965592942867597770?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_25_52-18300716469776431996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_32_12-13603051913389303623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_03_48-535313673114304470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_12_02-14965813861522602456?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_21_37-17491233853678331786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_22_38_19-3825097224302469419?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 44s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/m3vqhczyzukqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1241

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1241/display/redirect?page=changes>

Changes:

[chamikara] Merge pull request #10383: [BEAM-8575] Added a unit test to test that


------------------------------------------
[...truncated 1.41 MB...]
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-21959a0f-c9c3-4b2e-9701-29e0cb4f03b7
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-b96dbd6d-d40c-423a-beab-4065a8c1bf12
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:45243
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 37398 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=711, count=33, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=666, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=671, count=35, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=742, count=31, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=535, count=32, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=723, count=19, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=768, count=41, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
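
The block above is the raw MetricQueryResults dump from the Flink portable run. In the Python SDK the same numbers are normally read back from the PipelineResult with a MetricsFilter; the sketch below uses a stand-in pipeline, since the real word-count pipeline object is not part of this log.

import apache_beam as beam
from apache_beam.metrics.metric import MetricsFilter

p = beam.Pipeline()  # stand-in for the word-count pipeline whose metrics are dumped above
_ = p | beam.Create(['to be or not to be'])
result = p.run()
result.wait_until_finish()

# Pull user counters by namespace; against the real job this would return the
# __main__.WordExtractingDoFn counters (words, empty_lines, word_lengths) shown above.
filters = MetricsFilter().with_namespace('__main__.WordExtractingDoFn')
for counter in result.metrics().query(filters)['counters']:
    print('%s: %s' % (counter.key.metric.name, counter.committed))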
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempB7PKoE/artifactseyt4Nf/job_1514f131-7b58-442d-8798-89b0d52a1515/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempB7PKoE/artifactseyt4Nf/job_1514f131-7b58-442d-8798-89b0d52a1515/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1217030944-6e66e5d0_79d9492f-68ad-411e-99a2-02282eb4cb72
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1217030944-6e66e5d0_79d9492f-68ad-411e-99a2-02282eb4cb72
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
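
The reported average lines up with the user counters in the dump above (word_lengths 298 over words 96); a quick check, with the values copied from the log and integer division assumed for the rounding:

# Counters read off the accumulator dump above, not recomputed here.
total_word_length = 298  # __main__.WordExtractingDoFn 'word_lengths'
num_words = 96           # __main__.WordExtractingDoFn 'words'
print(total_word_length // num_words)  # 3, matching "average word length: 3"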

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/17 03:10:31 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:35843
19/12/17 03:10:31 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:38243
19/12/17 03:10:31 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:50465
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/17 03:10:33 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c
19/12/17 03:10:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
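
The LOOPBACK environment mentioned above runs the SDK harness inside the submitting process instead of a Docker container. A minimal sketch of pointing a pipeline at such a job server; the endpoint reuses the JobService port printed earlier, everything else is illustrative.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:50465',  # JobService address from the log above
    '--environment_type=LOOPBACK',     # SDK harness runs inside the submitting process
])
with beam.Pipeline(options=options) as p:
    _ = p | beam.Create(['hello', 'world'])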
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 03:10:34 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 03:10:34 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 03:10:34 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 03:10:34 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 03:10:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c on Spark master local[4]
19/12/17 03:10:35 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 03:10:35 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37301.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 03:10:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43433.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37767
19/12/17 03:10:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 03:10:38 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 03:11:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 03:11:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c finished.
19/12/17 03:11:22 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 03:11:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp9crPaS/artifactsEWvIxw/job_31abc513-2c2c-4318-ab4a-fce934b4023c/MANIFEST has 1 artifact locations
19/12/17 03:11:22 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp9crPaS/artifactsEWvIxw/job_31abc513-2c2c-4318-ab4a-fce934b4023c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 03:11:22 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c
19/12/17 03:11:22 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217031033-4c675b86_c68cb71b-11a7-494d-a589-9b0db8c9da4c
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576552282.664099148","description":"Error received from peer ipv4:127.0.0.1:37767","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576552282.664099148","description":"Error received from peer ipv4:127.0.0.1:37767","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576552282.664108655","description":"Error received from peer ipv4:127.0.0.1:43433","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576552282.664604252","description":"Error received from peer ipv4:127.0.0.1:37301","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
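
The three "Socket closed" tracebacks above are raised in SDK worker threads whose gRPC channels are torn down when the portable job finishes. A minimal sketch of how a client-side streaming loop can treat UNAVAILABLE during shutdown as benign (this is illustrative only, not the SDK's actual handling; `read_stream` is a made-up name):

    import grpc

    def read_stream(elements_iterator):
        # Iterate a gRPC streaming response; tolerate channel teardown.
        try:
            for elements in elements_iterator:
                pass  # placeholder: real code would process each message here
        except grpc.RpcError as e:
            if e.code() == grpc.StatusCode.UNAVAILABLE:
                # e.g. "Socket closed" when the job server goes away first
                return
            raise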


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3265.330s

OK (SKIP=4)
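
Several warnings in the test output above note that BigQuerySink is deprecated since 2.11.0 and recommend WriteToBigQuery. A minimal sketch of the recommended replacement (the table spec and schema below are placeholders, not values used by these tests):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create([{'word': 'beam', 'count': 1}])
         | 'WriteToBQ' >> beam.io.WriteToBigQuery(
               'my-project:my_dataset.my_table',  # placeholder table spec
               schema='word:STRING, count:INTEGER',
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))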

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
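
To reproduce the failure locally, the failing task path from the error above can be combined with the flags Gradle suggests, run from the Beam source root:

    ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --info --stacktrace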

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 23s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/eoe6tp6gtbl6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1240

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1240/display/redirect?page=changes>

Changes:

[aaltay] [BEAM-7390] Add code snippet for Latest (#10166)


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-92990e81-4efd-47bb-a4ce-8169d9874e97
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-9ad52f4d-c429-4a42-b4e8-64e06386d84d
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:46259
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 34912 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=728, count=34, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=666, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=734, count=38, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=846, count=36, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=675, count=40, min=14, max=20}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=715, count=19, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=723, count=39, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempjn9Nds/artifactsXtamss/job_b320b056-e6c5-4e5d-8b4d-e0af3170e332/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempjn9Nds/artifactsXtamss/job_b320b056-e6c5-4e5d-8b4d-e0af3170e332/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1217020239-81ce1fda_583088b7-613d-47ea-9265-802c0713efc0
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1217020239-81ce1fda_583088b7-613d-47ea-9265-802c0713efc0
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
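
For reference, the portable WordCount runs in this suite submit the standard example module against a local job server; a rough sketch of an equivalent manual invocation (input/output paths and the job-server port are placeholders; the LOOPBACK environment matches the log lines above):

    python -m apache_beam.examples.wordcount \
        --input=/path/to/input.txt \
        --output=/tmp/py-wordcount-output \
        --runner=PortableRunner \
        --job_endpoint=localhost:<job-server-port> \
        --environment_type=LOOPBACK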

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/17 02:03:23 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45841
19/12/17 02:03:23 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43851
19/12/17 02:03:23 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:59451
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/17 02:03:25 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c
19/12/17 02:03:25 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/17 02:03:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/17 02:03:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/17 02:03:26 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/17 02:03:26 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/17 02:03:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c on Spark master local[4]
19/12/17 02:03:27 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/17 02:03:27 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:44789.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/17 02:03:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:41291.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42469
19/12/17 02:03:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/17 02:03:30 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/17 02:03:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/17 02:04:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c finished.
19/12/17 02:04:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/17 02:04:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempuvPn8R/artifactsnKjD13/job_9c938949-6b51-4644-a48b-2c31f45bf6ad/MANIFEST has 1 artifact locations
19/12/17 02:04:05 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempuvPn8R/artifactsnKjD13/job_9c938949-6b51-4644-a48b-2c31f45bf6ad/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/17 02:04:05 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c
19/12/17 02:04:05 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1217020325-a7d0daab_a4fb12b7-fbc6-4351-9501-9d3c8af8513c
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576548245.880064998","description":"Error received from peer ipv4:127.0.0.1:42469","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576548245.880107996","description":"Error received from peer ipv4:127.0.0.1:44789","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576548245.880090399","description":"Error received from peer ipv4:127.0.0.1:41291","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576548245.880064998","description":"Error received from peer ipv4:127.0.0.1:42469","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
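
The three "Socket closed" tracebacks above all report StatusCode.UNAVAILABLE and appear only after the job has reported DONE, which is consistent with the data, control and state channels being closed while SDK harness threads are still blocked reading them. As a minimal illustrative sketch only (assuming the grpc package; drain_stream is a hypothetical helper, not part of Beam), a reader of such a response stream could tolerate that shutdown race like this:

    import grpc

    def drain_stream(responses):
        # Iterate a gRPC response stream, ignoring the UNAVAILABLE status that
        # is raised when the peer closes the socket after the job has finished.
        try:
            for response in responses:
                pass
        except grpc.RpcError as err:
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise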


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3350.519s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
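
For local triage only, the failing step above could be re-run directly with the diagnostics Gradle suggests (a sketch assuming a Beam source checkout with the standard Gradle wrapper at the repository root; the task path and flags are the ones printed above):

    # Re-run the failing HDFS integration test task with extra log output.
    ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --stacktrace --info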

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 1s
120 actionable tasks: 96 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/gx6g7jrwc35xg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1239

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1239/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-7970] Rebuild Go protos with new protobuf version


------------------------------------------
[...truncated 1.02 MB...]
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/2) (2c896762e70c48bbc50016444facf8a8) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 2c896762e70c48bbc50016444facf8a8.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (2/2) (attempt #0) to e1466e18-0851-49f5-b8e0-2526f1667600 @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (2/2).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (8542a5190fd2c0163bde8a60c82968c2) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (2c896762e70c48bbc50016444facf8a8) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) [DEPLOYING]
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) [DEPLOYING].
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (2c896762e70c48bbc50016444facf8a8) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) [DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5).
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 4420c8031e3519f031063150c866cec5.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (4420c8031e3519f031063150c866cec5) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-1217010701-43bc50dc (70fe7db0857160dad30cbc357126249a) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 70fe7db0857160dad30cbc357126249a reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-1217010701-43bc50dc(70fe7db0857160dad30cbc357126249a).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8137}, allocationId: f3581e55fe511993c1fb9b09747199ce, jobId: 70fe7db0857160dad30cbc357126249a).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 3792cceacac96ea0f88aecb1c3409eb0: JobManager is shutting down..
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager b7bf9bf9d55921a564e605220e754798@akka://flink/user/jobmanager_1 for job 70fe7db0857160dad30cbc357126249a from the resource manager.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 70fe7db0857160dad30cbc357126249a with leader id b7bf9bf9d55921a564e605220e754798 lost leadership.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8137}, allocationId: e96540a1d7a4f51f50d1e3d3aecae084, jobId: 70fe7db0857160dad30cbc357126249a).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 70fe7db0857160dad30cbc357126249a from job leader monitoring.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 70fe7db0857160dad30cbc357126249a.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 70fe7db0857160dad30cbc357126249a.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 70fe7db0857160dad30cbc357126249a because it is not registered.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 3792cceacac96ea0f88aecb1c3409eb0.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection e1466e18-0851-49f5-b8e0-2526f1667600 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-7f635aaf-3f9a-4324-9be8-7adad2b1ea5e
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-595c4b8b-5278-4725-8ced-2e789bfc3e8b
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-3cf04afd-caae-4087-9dc6-c764af9676dd
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:36675
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 14343 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 4, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 15, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=external_1ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 8, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 
46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 
46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 1, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 8, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous)}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 15, 
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 4, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous).output}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5)Distributions(46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=72, count=3, min=24, max=24}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=15, count=1, min=15, max=15}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=168, count=12, min=14, max=14}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, 
max=58}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).output/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, max=21}))
[flink-runner-job-invoker] WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_29dd40d4-906f-4929-b675-d2b76760ccf3","basePath":"/tmp/beam-artifact-staging"}: {}
java.io.FileNotFoundException: /tmp/beam-artifact-staging/job_29dd40d4-906f-4929-b675-d2b76760ccf3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
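
The BeamDeprecationWarning above names the replacement transform, WriteToBigQuery. A minimal sketch of the migrated write (assumes the gcp extras are installed; the project, dataset, table and schema below are placeholders, not values from this job):

import apache_beam as beam

with beam.Pipeline() as p:
    _ = (p
         | 'Create' >> beam.Create([{'word': 'hello', 'count': 1}])
         | 'WriteToBQ' >> beam.io.WriteToBigQuery(
               'my-project:my_dataset.word_counts',   # placeholder table spec
               schema='word:STRING, count:INTEGER',   # placeholder schema
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
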
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
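
The "options is deprecated" warnings are emitted whenever code reads <pipeline>.options back from the Pipeline object. The supported pattern is to keep a reference to the PipelineOptions used to build the pipeline and read views from it directly; a minimal sketch (the temp_location value is a placeholder):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, GoogleCloudOptions

# Keep your own reference to the options instead of reading pipeline.options later.
options = PipelineOptions(temp_location='gs://my-bucket/temp')      # placeholder location
temp_location = options.view_as(GoogleCloudOptions).temp_location

with beam.Pipeline(options=options) as p:
    _ = p | 'Create' >> beam.Create(['a', 'b', 'c'])
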
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
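
The FutureWarnings above come from the experimental fileio MatchAll and ReadMatches transforms exercised by fileio_test. A minimal sketch of that match-then-read pattern (the file glob is a placeholder):

import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    readables = (p
                 | 'Patterns' >> beam.Create(['/tmp/data/*.txt'])    # placeholder glob
                 | 'MatchAll' >> fileio.MatchAll()                    # expands each glob to file metadata
                 | 'ReadMatches' >> fileio.ReadMatches())             # yields ReadableFile objects
    paths = readables | 'GetPath' >> beam.Map(lambda f: f.metadata.path)
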
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3240.852s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_31-2823271527558151960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_06_40-18180850571204300387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_14_22-9298039415026113825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_21_11-5171780073582271432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_27_19-13159525292192905182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_33_52-13344205338231771936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_40_15-13670305367526359348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_46_28-1714696568545266149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_36-17952522156565367567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_13_43-1467780520151106833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_22_23-13166979954390172254?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_29_19-8808062341063158372?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_36_04-8274779440945286643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_30-16959813708930679511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_16_38-8996417010775224802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_24_16-2518597797886273703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_31_18-5984111707412525985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_31-10876990262648120015?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_16_59-17446672796817362226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_33_46-5802989510041430040?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_32-5600093188178961496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_11_46-13047545610930442829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_18_26-2130623603821382195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_24_42-13002670311920419904?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_31_01-1537006407554655046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_30-17076247535411359103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_06_15-15569137313585861516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_15_01-4506486228606838065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_22_45-8194064964716159443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_30_05-11239797650009051352?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_32-17078693001941546461?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_07_13-13695867424981819224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_14_33-11322038782492337785?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_21_13-4007451078085262161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_27_08-3266215804823937847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_16_59_31-196408339208041571?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_07_50-13463469875592877232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_17_51-4138903428130641134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_24_46-6376573719709753647?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_17_31_07-5026364892384527131?project=apache-beam-testing

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 16s
117 actionable tasks: 92 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/kxu62plekzati

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python2 - Build # 1238 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python2 (build #1238)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python2/1238/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python2 #1237

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1237/display/redirect?page=changes>

Changes:

[suztomo] [BEAM-8865] Updating Javadoc of FileIO example

[pabloem] Removing LINT.If/LINT.Then statement

[bhulette] [BEAM-8801] PubsubMessageToRow should not check useFlatSchema() in


------------------------------------------
[...truncated 1.44 MB...]
[flink-akka.actor.default-dispatcher-129] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-120] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-120] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-120] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-129] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-129] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-120] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-129] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-19] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-19] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-19] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-19] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-19] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-132] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-132] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-132] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:35889
[flink-akka.actor.default-dispatcher-132] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 631418 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_15}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=738, count=35, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, 
count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=747, count=37, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=648, count=34, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=845, count=36, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=640, count=38, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=665, count=17, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=644, count=34, min=16, max=22}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempa9GRz1/artifactsWCfxeF/job_73c34b31-1f65-41d3-b073-6706d00718f2/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempa9GRz1/artifactsWCfxeF/job_73c34b31-1f65-41d3-b073-6706d00718f2/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-4] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1216213518-3fa047cf_ac7da962-70a7-4de8-b0c0-7cf1dabb0d0e
[grpc-default-executor-4] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1216213518-3fa047cf_ac7da962-70a7-4de8-b0c0-7cf1dabb0d0e
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
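For reference, the word_len_dist figures in the MetricQueryResults dump above (NAMESPACE=__main__.WordExtractingDoFn, DistributionResult{sum=298, count=96, min=1, max=10}) and the "number of empty lines" / "average word length" lines are produced by user metrics in the wordcount example's DoFn. A minimal sketch of how such metrics are typically declared and updated with the Beam Metrics API (not necessarily the exact example source; names follow the metric names seen above):

    # Hedged sketch: user counter and distribution metrics as reported in the
    # job metrics above (DistributionResult with sum/count/min/max).
    import re
    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class WordExtractingDoFn(beam.DoFn):
        def __init__(self):
            super(WordExtractingDoFn, self).__init__()
            # Namespace defaults to the class, matching __main__.WordExtractingDoFn.
            self.word_len_dist = Metrics.distribution(self.__class__, 'word_len_dist')
            self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

        def process(self, element):
            if not element.strip():
                self.empty_line_counter.inc()
            for word in re.findall(r"[\w\']+", element):
                # Each update contributes to sum/count/min/max of the distribution.
                self.word_len_dist.update(len(word))
                yield word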

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
WARNING:root:Waiting for grpc channel to be ready at localhost:58387.
WARNING:root:Waiting for grpc channel to be ready at localhost:58387.
WARNING:root:Waiting for grpc channel to be ready at localhost:58387.
19/12/16 21:46:29 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:34583
19/12/16 21:46:29 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:33919
19/12/16 21:46:29 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:58387
WARNING:root:Waiting for grpc channel to be ready at localhost:58387.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/16 21:46:34 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858
19/12/16 21:46:34 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/16 21:46:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/16 21:46:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/16 21:46:39 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/16 21:46:40 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/16 21:46:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858 on Spark master local[4]
19/12/16 21:46:43 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/16 21:46:43 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46883.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/16 21:46:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37659.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37631
19/12/16 21:46:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/16 21:46:51 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/16 21:49:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858: Pipeline translated successfully. Computing outputs

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
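The "Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d", "Starting finalize_write ... num_shards: 4" and "Renamed 4 shards" lines come from writing with a fixed shard count: the sink clears any previous output matching the shard pattern, then renames temporary shards into place. A minimal sketch under assumed paths (illustrative only, not the test pipeline itself):

    # Hedged sketch: WriteToText with num_shards=4 yields files named
    # /tmp/wordcount/output-00000-of-00004 ... -00003-of-00004; on re-run,
    # pre-existing files matching that pattern are deleted before finalize.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create(['one', 'two', 'three'])
         | 'Write' >> beam.io.WriteToText('/tmp/wordcount/output', num_shards=4))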
19/12/16 21:50:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858 finished.
19/12/16 21:50:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/16 21:50:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempxNNHY9/artifacts8FBijp/job_7947ea59-e92d-47de-a959-304b008649c2/MANIFEST has 1 artifact locations
19/12/16 21:50:49 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempxNNHY9/artifacts8FBijp/job_7947ea59-e92d-47de-a959-304b008649c2/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/16 21:50:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858
19/12/16 21:50:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1216214634-101bcaa7_05697039-cfab-40dd-80ee-93ff5f844858
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576533049.582694263","description":"Error received from peer ipv4:127.0.0.1:46883","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576533049.582676686","description":"Error received from peer ipv4:127.0.0.1:37659","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576533049.582638085","description":"Error received from peer ipv4:127.0.0.1:37631","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576533049.582638085","description":"Error received from peer ipv4:127.0.0.1:37631","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
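These _Rendezvous tracebacks (StatusCode.UNAVAILABLE, "Socket closed") appear only after the job has already reached DONE: the job server tears down the control, state and data channels (localhost:46883, :37659, :37631), and the SDK worker's blocking stream reads surface that as UNAVAILABLE rather than a clean end-of-stream. A minimal sketch, not Beam source, of how a streaming read loop can treat that error as benign during shutdown (on_response and is_shutting_down are hypothetical callables supplied by the caller):

    # Hedged sketch: tolerate UNAVAILABLE ("Socket closed") raised when the
    # peer closes a gRPC channel while the client is still reading.
    import grpc

    def drain_stream(response_iterator, on_response, is_shutting_down):
        try:
            for response in response_iterator:
                on_response(response)
        except grpc.RpcError as err:
            # A server-initiated close shows up client-side as UNAVAILABLE;
            # only propagate it if we were not already shutting down.
            if err.code() != grpc.StatusCode.UNAVAILABLE or not is_shutting_down():
                raise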


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3460.137s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_44-269899280387354582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_21_08-3402493399295452377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_29_11-4769828025747522355?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_36_45-10828641390939822706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_43_42-1756822204393135204?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_50_41-9153173258525608509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_57_49-11368647271680776638?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_14_04_18-825380758461440637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_42-2867579507262963080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_31_07-2533472119459369315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_48_27-8234023346127681557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_48-4717191993187615240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_28_48-10625142762675916255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_37_37-13898710708023780063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_45_31-11402627054542326667?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_52_06-17066981363248340374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_46-9548317425807050876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_26_08-10796662350577455857?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_32_31-13231651412173929527?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_39_28-7596030769493567289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_46_10-15426718649850921766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_43-16356812063072447006?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_30_07-5402163187410473004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_38_32-823500100153375103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_46_05-6670185047444728422?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_42-3399281009777014938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_20_49-1541626292806593603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_30_00-7542307600282280268?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_38_12-10117934881146190646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_45_22-3482676630452513056?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_42-10574107621344580815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_22_26-8646131057867591609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_31_57-982375866994724511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_38_51-126169137305380182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_46_07-16016079130188791705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_13_43-12939363065857688552?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_21_34-13542713474217878583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_29_20-15516287329148048991?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_35_43-1617597680585524761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_13_43_05-3854721968721429466?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 57s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/zc2iapx6qcgc6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1236

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1236/display/redirect?page=changes>

Changes:

[github] [BEAM-8446] Retrying BQ query on timeouts (#9855)


------------------------------------------
[...truncated 1001.90 KB...]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from CREATED to SCHEDULED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (d1c620eec958477e529aca7137f0926a) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) d1c620eec958477e529aca7137f0926a.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (attempt #0) to fb1737e2-1d5f-41a6-a6c1-679fbe4ec667 @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (6c97b299f8103d782763c4a68ed17fd2) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (d1c620eec958477e529aca7137f0926a) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 974ce771ac77e42577265eca2667493e.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (1/2) (attempt #0) to fb1737e2-1d5f-41a6-a6c1-679fbe4ec667 @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (1/2).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (974ce771ac77e42577265eca2667493e) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) [DEPLOYING]
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d).
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) b2c2a66281d361559828d0bc93c3384d.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (b2c2a66281d361559828d0bc93c3384d) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (1cfa9cd680631a2235b30c8152b6605e) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (1cfa9cd680631a2235b30c8152b6605e).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (1cfa9cd680631a2235b30c8152b6605e) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 1cfa9cd680631a2235b30c8152b6605e.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (2/2) (attempt #0) to fb1737e2-1d5f-41a6-a6c1-679fbe4ec667 @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (2/2).
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) [DEPLOYING]
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) [DEPLOYING].
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (1cfa9cd680631a2235b30c8152b6605e) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) [DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b).
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 45b6a239a2fac8927d3253a5bb8daf0b.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (45b6a239a2fac8927d3253a5bb8daf0b) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-1216191427-ac4b729f (27a328e2eb5f8557a9a51117684d7daf) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 27a328e2eb5f8557a9a51117684d7daf reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-1216191427-ac4b729f(27a328e2eb5f8557a9a51117684d7daf).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection a7f5f27be820ca7d90012114b644079e: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8137}, allocationId: d1b3785f8793c832764eab2161e037d1, jobId: 27a328e2eb5f8557a9a51117684d7daf).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager bd7b23d6a33bed225a3984c29a1b492d@akka://flink/user/jobmanager_1 for job 27a328e2eb5f8557a9a51117684d7daf from the resource manager.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 27a328e2eb5f8557a9a51117684d7daf with leader id bd7b23d6a33bed225a3984c29a1b492d lost leadership.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8137}, allocationId: 5d62951888bb17508ae1c7cd04b14631, jobId: 27a328e2eb5f8557a9a51117684d7daf).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 27a328e2eb5f8557a9a51117684d7daf from job leader monitoring.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 27a328e2eb5f8557a9a51117684d7daf.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 27a328e2eb5f8557a9a51117684d7daf.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 27a328e2eb5f8557a9a51117684d7daf because it is not registered.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection a7f5f27be820ca7d90012114b644079e.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection fb1737e2-1d5f-41a6-a6c1-679fbe4ec667 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-d49d5fe6-73dd-478b-8d6c-e64f8c471b98
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-03c40138-b713-46fb-b92e-3b4b9ed96760
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-89693221-fd82-4b37-b4d9-8065b56ebbd4
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44503
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 14831 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 4, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 2, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 1, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2573>)_4}: 2, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:400>)_21}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous)}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 
37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2573>)_26}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:401>)_22}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous).output}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:396>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 37Map(<lambda at external_test.py:396>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=15, count=1, min=15, max=15}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=154, count=11, min=14, max=14}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
[flink-runner-job-invoker] WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_47d053fd-f137-471e-babd-d8f0ce4be746","basePath":"/tmp/beam-artifact-staging"}: {}
java.io.FileNotFoundException: /tmp/beam-artifact-staging/job_47d053fd-f137-471e-babd-d8f0ce4be746/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
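
The BigQuerySink deprecation warning above points at WriteToBigQuery as the replacement. A minimal sketch of that migration, assuming a hypothetical project, dataset, and schema (none of these names come from this job):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | 'MakeRows' >> beam.Create([{'word': 'beam', 'count': 3}])      # toy input rows
            | 'WriteCounts' >> beam.io.WriteToBigQuery(
                table='my-project:my_dataset.word_counts',                    # hypothetical table
                schema='word:STRING,count:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
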
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
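
The "options is deprecated" warnings above are raised when a pipeline reads configuration back through p.options. The supported pattern is to keep a reference to the PipelineOptions object the pipeline was built with and call view_as on that. A minimal sketch, with a hypothetical temp_location:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(temp_location='gs://my-bucket/tmp')        # hypothetical bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location    # read from options, not p.options
    with beam.Pipeline(options=options) as p:
        _ = p | 'Placeholder' >> beam.Create([1, 2, 3])                  # stand-in transform
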
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
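
The FutureWarnings above come from test pipelines that use the experimental fileio match/read transforms. A self-contained sketch of that pattern, with a hypothetical file glob and a SHA-1 digest standing in for compute_hash:

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Patterns' >> beam.Create(['/tmp/data/*.txt'])             # hypothetical glob
            | 'MatchAll' >> fileio.MatchAll()                             # experimental transform
            | 'ReadMatches' >> fileio.ReadMatches()                       # experimental transform
            | 'Checksums' >> beam.Map(
                lambda f: hashlib.sha1(f.read()).hexdigest()))            # per-file checksum
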
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3398.206s

OK (SKIP=4)

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 5s
117 actionable tasks: 97 executed, 17 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/4ei3kvezejlzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1235

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1235/display/redirect?page=changes>

Changes:

[ningk] [BEAM-8837] Fix pcoll_visualization tests

[ningk] Use explicitly named dictionary when watching local scope variables in

[lukasz.gajowy] [BEAM-5495] Add possibility of scanning the classpath without

[lukasz.gajowy] [BEAM-5495] Enable injecting custom resources detection algorithms.

[lukasz.gajowy] [BEAM-5495] Move PipelineResources to

[ningk] Avoid patching test pcoll data and use real logic to produce test data.

[lukasz.gajowy] [BEAM-5495] Remove unneded and deprecated code

[lukasz.gajowy] [BEAM-5495] Use ClassGraph to detect classpath resources


------------------------------------------
[...truncated 1.43 MB...]
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-15a902f4-168d-4bda-a70d-92d26d31771b
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-20] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-20] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-20] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-20] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-20] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-20] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-19] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-19] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-19] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:40419
[flink-akka.actor.default-dispatcher-19] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 72656 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=761, count=36, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=793, count=39, min=18, max=27}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=690, count=36, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=874, count=37, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=730, count=44, min=14, max=20}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=810, count=21, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=667, count=36, min=16, max=22}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempgDuXPZ/artifactsTm4BdU/job_80135645-3d2d-4028-9399-bcbbd48c4e41/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempgDuXPZ/artifactsTm4BdU/job_80135645-3d2d-4028-9399-bcbbd48c4e41/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1216175225-6c0e42de_f3247dfe-5645-49fc-a0d9-7a3d28b4eea4
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1216175225-6c0e42de_f3247dfe-5645-49fc-a0d9-7a3d28b4eea4
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
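
The two figures above come from the user metrics visible in the MetricQueryResults dump earlier in this log (NAMESPACE=__main__.WordExtractingDoFn: words=96, empty_lines=2, word_lengths=298, and the word_len_dist distribution with sum=298 over count=96, i.e. an average length of roughly 3). As a rough, illustrative sketch of how such metrics are typically declared in a DoFn and read back from the pipeline result (the metric names mirror the log, but this is not the exact test source):

import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.metrics.metric import MetricsFilter

class WordExtractingDoFn(beam.DoFn):
    """Counts words and records per-word lengths, as reflected in the dump above."""

    def __init__(self):
        # Passing the class as the namespace yields "__main__.WordExtractingDoFn".
        self.words_counter = Metrics.counter(self.__class__, 'words')
        self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
        self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')
        self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

    def process(self, element):
        if not element.strip():
            self.empty_line_counter.inc(1)
        words = element.split()
        for word in words:
            self.words_counter.inc()
            self.word_lengths_counter.inc(len(word))
            self.word_lengths_dist.update(len(word))
        return words

# After result = pipeline.run(); result.wait_until_finish(), the driver can query, e.g.:
#   empty_lines = result.metrics().query(
#       MetricsFilter().with_name('empty_lines'))['counters']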

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/16 17:53:47 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37403
19/12/16 17:53:47 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:37283
19/12/16 17:53:47 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:58851
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/16 17:53:49 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340
19/12/16 17:53:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/16 17:53:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/16 17:53:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/16 17:53:50 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/16 17:53:51 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/16 17:53:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340 on Spark master local[4]
19/12/16 17:53:51 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/16 17:53:51 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:32857.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/16 17:53:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37405.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:35997
19/12/16 17:53:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/16 17:53:55 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/16 17:54:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/16 17:55:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340 finished.
19/12/16 17:55:01 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/16 17:55:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempIZM0xu/artifacts00Xqyr/job_e791bd65-30c0-40b3-8e21-0b9db69f19fe/MANIFEST has 1 artifact locations
19/12/16 17:55:01 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempIZM0xu/artifacts00Xqyr/job_e791bd65-30c0-40b3-8e21-0b9db69f19fe/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/16 17:55:01 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340
19/12/16 17:55:01 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1216175349-30b68cc3_6b08c12a-c3e9-4800-b6ba-de8109ebc340
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576518901.891986029","description":"Error received from peer ipv4:127.0.0.1:35997","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576518901.891986029","description":"Error received from peer ipv4:127.0.0.1:35997","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576518901.892019737","description":"Error received from peer ipv4:127.0.0.1:37405","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576518901.892034994","description":"Error received from peer ipv4:127.0.0.1:32857","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
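
The three "Socket closed" tracebacks above are raised in the SDK worker's background reader threads (data plane, state channel and control channel) and only appear after the job has already been reported as DONE and its metrics fetched, so they look like shutdown-time disconnects from the embedded job server rather than a pipeline failure. Purely as an illustration of what the gRPC layer is reporting (not the worker's actual code), a client-side reader that tolerates that status would look roughly like:

import grpc

def drain(response_iterator):
    # Read a streaming RPC to completion; a server closing the socket during
    # shutdown surfaces here as StatusCode.UNAVAILABLE ("Socket closed").
    try:
        for _ in response_iterator:
            pass
    except grpc.RpcError as err:
        if err.code() != grpc.StatusCode.UNAVAILABLE:
            raise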


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3374.638s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
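
Following that suggestion, the failing task can be re-run on its own from a Beam checkout with the extra diagnostic flags; a hypothetical reproduction sketch (the wrapper path and working directory are assumptions, not taken from this log):

import subprocess

# Re-run only the HDFS integration test suite with a stack trace and more log output.
subprocess.check_call([
    './gradlew',
    ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest',
    '--stacktrace',
    '--info',
])  # raises CalledProcessError if the 'sh' step exits non-zero again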

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 47s
120 actionable tasks: 99 executed, 18 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/zdbyienps5qny

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1234

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1234/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8962] Add option to disable the metric container accumulator


------------------------------------------
[...truncated 1.48 MB...]
19/12/16 16:54:05 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:42441
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/16 16:54:07 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d
19/12/16 16:54:07 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/16 16:54:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/16 16:54:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/16 16:54:07 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/16 16:54:07 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/16 16:54:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d on Spark master local[4]
19/12/16 16:54:08 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/16 16:54:08 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:36771.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/16 16:54:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39359.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:43837
19/12/16 16:54:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/16 16:54:11 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/16 16:54:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/16 16:55:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d finished.
19/12/16 16:55:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/16 16:55:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempMU_3IH/artifactsqaBz2S/job_949ba33c-e23a-4f84-8854-2d17fa3f1786/MANIFEST has 1 artifact locations
19/12/16 16:55:03 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempMU_3IH/artifactsqaBz2S/job_949ba33c-e23a-4f84-8854-2d17fa3f1786/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/16 16:55:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d
19/12/16 16:55:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1216165407-c0ce48eb_c0cdd84f-8334-4d7f-a0cd-27c2228fe65d
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576515304.114044418","description":"Error received from peer ipv4:127.0.0.1:39359","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576515304.114080941","description":"Error received from peer ipv4:127.0.0.1:36771","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576515304.113955056","description":"Error received from peer ipv4:127.0.0.1:43837","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576515304.113955056","description":"Error received from peer ipv4:127.0.0.1:43837","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3313.338s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_43_59-6886172166209254600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_51_14-2430119292534363809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_10-9334989072148176925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_04_56-11653601158124173908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_11_31-9319393585152004424?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_17_51-2284594213840750739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_25_06-13624653546768442820?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_31_36-14555675196892773575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_43_59-15072662634665106682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_02_54-294726393724959701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_09_20-18372476148048757366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_16_05-13813947722493032071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_44_02-17509642210399939451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_56_12-15383310332293623879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_02_24-13839435421833248172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_08_31-1973492791790458018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_15_07-16079159231897189161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_44_04-14855581459925731216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_12-7209167439883009522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_06_24-14900293718057974137?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_13_43-1644540987160614458?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_19_45-2490745502465920280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_43_59-12044341447807434130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_02_11-6220465177867903733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_43_59-13771444920760979201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_50_46-10695551185839409718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_47-4899249818190038698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_06_12-3449587127951739546?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_13_01-13176717813969687081?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_43_59-18114382256374578412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_50_48-15523024701563217533?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_15-3885129537316296705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_04_29-13826773008951299968?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_10_52-16670768405765957690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_18_22-17319329238926226943?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_44_00-7599976765336505625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_51_57-14323924621828188277?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_00_55-1497164919243489799?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_08_35-7527725998643220653?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_15_17-3843590028317952738?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 29s
120 actionable tasks: 96 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/7ugvo6jmuaytk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python2 - Build # 1233 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python2 (build #1233)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python2/1233/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python2 #1232

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1232/display/redirect?page=changes>

Changes:

[suztomo] Declare JSR305 dependency as 'shadow'


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-73191dcd-1e82-4cb2-9e93-3bc69f83b35d
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-4dab22e2-f81c-4f8a-839e-37ab0ecf5853
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:39917
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 48412 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_15}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=725, count=34, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, 
count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=680, count=34, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=627, count=33, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=824, count=35, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=572, count=34, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=749, count=20, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=755, count=41, min=16, max=25}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp5ENe5_/artifacts6qk8Ad/job_1cb8de59-f0f0-474f-8d7a-82125ca4b2ae/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp5ENe5_/artifacts6qk8Ad/job_1cb8de59-f0f0-474f-8d7a-82125ca4b2ae/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1216093937-927ac4b9_abb301cd-15a5-47aa-a242-795600af1fff
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1216093937-927ac4b9_abb301cd-15a5-47aa-a242-795600af1fff
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
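
The two INFO:root lines above come from querying the pipeline's user-defined metrics after the portable run finishes; the counter and distribution names involved (words, empty_lines, word_lengths, word_len_dist in namespace __main__.WordExtractingDoFn) are visible in the accumulator dump earlier in this log. A minimal sketch of how such metrics are typically declared and read back with the Beam Python Metrics / MetricsFilter API follows; it is an illustration written for this note, not code copied from this build.

    import logging
    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.metrics.metric import MetricsFilter

    class WordExtractingDoFn(beam.DoFn):
        def __init__(self):
            # Using the class as the namespace yields "__main__.WordExtractingDoFn",
            # matching the user metrics reported in the log above.
            self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')
            self.word_len_dist = Metrics.distribution(self.__class__, 'word_len_dist')

        def process(self, line):
            if not line.strip():
                self.empty_line_counter.inc()
            for word in line.split():
                self.word_len_dist.update(len(word))
                yield word

    def log_metrics(result):
        # Call after result = pipeline.run(); result.wait_until_finish().
        counters = result.metrics().query(
            MetricsFilter().with_name('empty_lines'))['counters']
        if counters:
            logging.info('number of empty lines: %d', counters[0].result)
        dists = result.metrics().query(
            MetricsFilter().with_name('word_len_dist'))['distributions']
        if dists:
            logging.info('average word length: %d', dists[0].result.mean)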

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/16 09:40:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:34117
19/12/16 09:40:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:45053
19/12/16 09:40:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:41663
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/16 09:40:36 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534
19/12/16 09:40:36 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/16 09:40:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/16 09:40:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/16 09:40:36 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/16 09:40:37 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/16 09:40:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534 on Spark master local[4]
19/12/16 09:40:38 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/16 09:40:38 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:38039.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/16 09:40:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43237.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42497
19/12/16 09:40:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/16 09:40:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/16 09:41:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/16 09:41:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534 finished.
19/12/16 09:41:39 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/16 09:41:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temprJJQGG/artifactszuSp9e/job_dbf6623a-7cd2-4088-b2ba-efaf930293bd/MANIFEST has 1 artifact locations
19/12/16 09:41:39 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temprJJQGG/artifactszuSp9e/job_dbf6623a-7cd2-4088-b2ba-efaf930293bd/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/16 09:41:39 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534
19/12/16 09:41:39 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1216094036-9f79cc1d_5d88e011-b0be-4b80-8e69-113d77111534
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576489299.477195718","description":"Error received from peer ipv4:127.0.0.1:43237","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576489299.477222980","description":"Error received from peer ipv4:127.0.0.1:38039","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576489299.477152974","description":"Error received from peer ipv4:127.0.0.1:42497","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576489299.477152974","description":"Error received from peer ipv4:127.0.0.1:42497","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
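
The _Rendezvous tracebacks above come from the SDK worker's reader threads (read_state, run_worker_1-1, and the data-plane reader) and are raised only after the portable runner has already reported "Job state changed to DONE": the control (38039), state (43237) and data (42497) channels created earlier are torn down by the runner, so the blocked streaming reads fail with StatusCode.UNAVAILABLE / "Socket closed", which reads as shutdown noise in the LOOPBACK environment rather than the cause of the build failure. A minimal sketch of the general pattern for draining a gRPC response stream while tolerating that kind of teardown; this is an illustration, not Beam's worker code, and the drain/handle names are made up for it.

    import grpc

    def drain(responses, handle):
        """Consume a gRPC streaming response iterator until the channel closes."""
        try:
            for response in responses:
                handle(response)
        except grpc.RpcError as err:
            # UNAVAILABLE ("Socket closed") is expected once the peer shuts the
            # channel down at the end of the job; anything else is a real error.
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise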


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3379.178s

OK (SKIP=4)
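
The BeamDeprecationWarning messages interleaved with the test results above are emitted by code that still reads <pipeline>.options inside transforms. A minimal sketch of the pattern the warning points toward, keeping a reference to the PipelineOptions used to build the pipeline instead of reading them back off the Pipeline object (the temp_location value is a hypothetical placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--temp_location=gs://example-bucket/tmp'])
    # Read settings from the options object used to construct the pipeline,
    # rather than from p.options, which the warning deprecates.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['placeholder']) | beam.Map(len)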

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
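
One way to reproduce the failing step with the extra logging suggested above is to rerun just that task on its own, e.g. ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --info --stacktrace (a suggested local invocation, not the original CI command line).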

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 41s
120 actionable tasks: 100 executed, 17 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/jbcq6oikci3r6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1231

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1231/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-a9e633fd-3fff-4c34-9963-e253aff7a905
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-c5ac0414-a5ce-4f05-b564-6dc729e00109
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-b1b9bb43-424e-4f28-adce-cb35b8bde6f1
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44407
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 53917 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=764, count=36, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=674, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=693, count=36, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=828, count=35, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=715, count=43, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=743, count=20, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=627, count=34, min=16, max=22}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp0nqCdm/artifactsoTQSwN/job_f04c1eac-4105-41ad-b07d-125501a0ef9b/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp0nqCdm/artifactsoTQSwN/job_f04c1eac-4105-41ad-b07d-125501a0ef9b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1216061331-8bf4da47_a9fd3ea3-3c55-4e75-9f5c-016104346618
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1216061331-8bf4da47_a9fd3ea3-3c55-4e75-9f5c-016104346618
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/16 06:14:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:43471
19/12/16 06:14:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:42175
19/12/16 06:14:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:52391
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/16 06:14:35 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70
19/12/16 06:14:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/16 06:14:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/16 06:14:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/16 06:14:35 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/16 06:14:36 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/16 06:14:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70 on Spark master local[4]
19/12/16 06:14:36 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/16 06:14:36 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:39717.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/16 06:14:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37429.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:40283
19/12/16 06:14:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/16 06:14:39 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/16 06:15:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/16 06:15:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70 finished.
19/12/16 06:15:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/16 06:15:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempukqhqM/artifactsb7_SHC/job_bdd8ed98-f324-42bc-99e2-3db9220d3f1c/MANIFEST has 1 artifact locations
19/12/16 06:15:32 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempukqhqM/artifactsb7_SHC/job_bdd8ed98-f324-42bc-99e2-3db9220d3f1c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/16 06:15:32 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70
19/12/16 06:15:32 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1216061435-8a1818d_32b47211-aa5a-4fce-93aa-f763b8b2cd70
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576476932.745667714","description":"Error received from peer ipv4:127.0.0.1:40283","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576476932.745701048","description":"Error received from peer ipv4:127.0.0.1:37429","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576476932.745667714","description":"Error received from peer ipv4:127.0.0.1:40283","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576476932.745717469","description":"Error received from peer ipv4:127.0.0.1:39717","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
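The three tracebacks above are the SDK harness's data, state and control channels being torn down after the job has already reported DONE; each gRPC stream then fails with StatusCode.UNAVAILABLE and details "Socket closed". A minimal sketch, not taken from the Beam SDK, of a reader that treats that specific status as an expected shutdown race rather than a hard failure (on_elements and job_is_done are hypothetical hooks, not Beam APIs):

import grpc

def drain(response_iterator, on_elements, job_is_done):
    # Consume a streaming gRPC response, tolerating channel teardown.
    try:
        for elements in response_iterator:
            on_elements(elements)
    except grpc.RpcError as err:
        # A channel closed by the job server surfaces as UNAVAILABLE with
        # details "Socket closed"; once the job is DONE this is benign.
        if err.code() == grpc.StatusCode.UNAVAILABLE and job_is_done():
            return
        raise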


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3346.105s

OK (SKIP=4)
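The BeamDeprecationWarning lines above ("options is deprecated since First stable release") are raised when code reads options back via <pipeline>.options. A rough sketch of the pattern the warning points toward, assuming you keep a handle on the PipelineOptions you constructed up front (the bucket name is a placeholder, not from this job):

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

# Build the options once and keep a reference, rather than reading
# them back from the pipeline object later.
options = PipelineOptions(temp_location='gs://example-bucket/tmp')  # placeholder bucket
temp_location = options.view_as(GoogleCloudOptions).temp_location

with beam.Pipeline(options=options) as p:
    _ = p | beam.Create(['temp files go under %s' % temp_location])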

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 16s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ijtytqnce2w66

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1230

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1230/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-dcc9078f-e1b6-4be1-b3d3-1e0d1dec977b
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44699
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15909 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=695, count=33, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=758, count=37, min=18, max=27}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=688, count=36, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=826, count=35, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=556, count=33, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=742, count=19, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=650, count=34, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempwhuDfh/artifacts9yARVC/job_f7d194ab-fd62-4f1e-bb03-e52e491802a6/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempwhuDfh/artifacts9yARVC/job_f7d194ab-fd62-4f1e-bb03-e52e491802a6/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1216000915-9f836dd8_cbf3d1f1-c32a-4b88-9bc6-4101d897a01c
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1216000915-9f836dd8_cbf3d1f1-c32a-4b88-9bc6-4101d897a01c
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/16 00:09:39 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:36511
19/12/16 00:09:40 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:32985
19/12/16 00:09:40 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:59731
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/16 00:09:41 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e
19/12/16 00:09:41 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/16 00:09:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/16 00:09:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/16 00:09:42 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/16 00:09:42 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/16 00:09:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e on Spark master local[4]
19/12/16 00:09:43 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/16 00:09:43 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:35093.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/16 00:09:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:36731.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:46109
19/12/16 00:09:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/16 00:09:46 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/16 00:09:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/16 00:10:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e finished.
19/12/16 00:10:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/16 00:10:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempBl6z6s/artifactsbctl4H/job_0426db84-e39f-4007-942f-21fa89a5419f/MANIFEST has 1 artifact locations
19/12/16 00:10:02 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempBl6z6s/artifactsbctl4H/job_0426db84-e39f-4007-942f-21fa89a5419f/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/16 00:10:02 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e
19/12/16 00:10:02 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1216000941-a695a709_8e45a279-063b-4327-8fa2-4474aaeda06e
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576455002.835313887","description":"Error received from peer ipv4:127.0.0.1:46109","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576455002.835313887","description":"Error received from peer ipv4:127.0.0.1:46109","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576455002.835402873","description":"Error received from peer ipv4:127.0.0.1:36731","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576455002.835527888","description":"Error received from peer ipv4:127.0.0.1:35093","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3330.015s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 41s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/gujiwabrskbuk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1229

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1229/display/redirect>

Changes:


------------------------------------------
[...truncated 1.40 MB...]
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-3d44b2b4-b48f-44d9-aed5-7f0bb6227cdf
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37713
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 44048 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 
44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=770, count=36, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=733, count=36, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=557, count=29, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=798, count=34, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=506, count=30, min=14, max=23}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=764, count=21, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=747, count=40, min=16, max=25}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempNhxJc8/artifactsLeId3O/job_9c70dd77-5450-405e-b0a5-d192db8da74d/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempNhxJc8/artifactsLeId3O/job_9c70dd77-5450-405e-b0a5-d192db8da74d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1215181321-6dacb739_52b92bbf-3950-4cdf-8a1e-c192d507bf8b
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1215181321-6dacb739_52b92bbf-3950-4cdf-8a1e-c192d507bf8b
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/15 18:14:13 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:42683
19/12/15 18:14:13 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:46615
19/12/15 18:14:14 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:38471
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/15 18:14:15 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa
19/12/15 18:14:16 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/15 18:14:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/15 18:14:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/15 18:14:16 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/15 18:14:16 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/15 18:14:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa on Spark master local[4]
19/12/15 18:14:17 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/15 18:14:17 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45131.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/15 18:14:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39691.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42511
19/12/15 18:14:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/15 18:14:20 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/15 18:14:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/15 18:14:59 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa finished.
19/12/15 18:14:59 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/15 18:14:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempzIMrB3/artifactsLLmgHJ/job_1591e56e-8e0c-40fa-9ecf-75bbeb231b3e/MANIFEST has 1 artifact locations
19/12/15 18:14:59 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempzIMrB3/artifactsLLmgHJ/job_1591e56e-8e0c-40fa-9ecf-75bbeb231b3e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/15 18:14:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa
19/12/15 18:14:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1215181415-dd7e7119_a5f5805e-9f5a-4659-9fc3-ef98137a58fa
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576433699.717181210","description":"Error received from peer ipv4:127.0.0.1:39691","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576433699.717193035","description":"Error received from peer ipv4:127.0.0.1:45131","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576433699.717157585","description":"Error received from peer ipv4:127.0.0.1:42511","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576433699.717157585","description":"Error received from peer ipv4:127.0.0.1:42511","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
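
The read_state, run_worker and data-plane tracebacks above all end in the same gRPC _Rendezvous with StatusCode.UNAVAILABLE ("Socket closed"): the background reader threads are still iterating their streams when the job server tears the channels down after the job has already reached DONE. A minimal, hypothetical sketch of a reader loop that treats UNAVAILABLE at shutdown as benign instead of letting it surface as a thread exception (the function name and logging are illustrative, not the SDK worker's actual code):

    import logging

    import grpc

    def drain_responses(response_iterator):
        """Iterates a gRPC streaming response, tolerating channel teardown.

        UNAVAILABLE ("Socket closed") is expected once the server side of the
        channel goes away at the end of the job; any other error is re-raised.
        """
        try:
            for response in response_iterator:
                yield response
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                logging.info('Channel closed during shutdown: %s', err.details())
            else:
                raise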


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3254.457s

OK (SKIP=4)
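
Several of the Dataflow integration tests above emit BeamDeprecationWarning pointing from BigQuerySink to WriteToBigQuery. A minimal sketch of that migration with the Beam Python SDK; the project, dataset, table and schema strings are placeholders:

    import apache_beam as beam

    def write_word_counts(counts):
        # Deprecated since Beam 2.11.0, as the warnings above note:
        #   counts | beam.io.Write(beam.io.BigQuerySink(
        #       'my-project:my_dataset.word_counts',
        #       schema='word:STRING,count:INTEGER'))
        # Suggested replacement:
        return counts | 'WriteToBQ' >> beam.io.WriteToBigQuery(
            'my-project:my_dataset.word_counts',
            schema='word:STRING,count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)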

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 27s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/j4skic24tuzqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1228

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1228/display/redirect>

Changes:


------------------------------------------
[...truncated 1.48 MB...]
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/15 12:09:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/15 12:09:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/15 12:09:36 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/15 12:09:36 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/15 12:09:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1215120936-9040dbbc_31231b32-0697-424a-950a-c87c1d036ce8 on Spark master local[4]
19/12/15 12:09:37 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/15 12:09:37 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37191.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/15 12:09:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43795.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42999
19/12/15 12:09:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/15 12:09:40 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/15 12:09:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215120936-9040dbbc_31231b32-0697-424a-950a-c87c1d036ce8: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/15 12:09:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215120936-9040dbbc_31231b32-0697-424a-950a-c87c1d036ce8 finished.
19/12/15 12:09:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/15 12:09:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempIsxxXa/artifactsFq3EP0/job_89478ec4-5fff-4a77-9d24-60acc3da8480/MANIFEST has 1 artifact locations
19/12/15 12:09:56 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempIsxxXa/artifactsFq3EP0/job_89478ec4-5fff-4a77-9d24-60acc3da8480/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/15 12:09:56 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1215120936-9040dbbc_31231b32-0697-424a-950a-c87c1d036ce8
19/12/15 12:09:56 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1215120936-9040dbbc_31231b32-0697-424a-950a-c87c1d036ce8
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576411796.763994868","description":"Error received from peer ipv4:127.0.0.1:42999","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576411796.763994868","description":"Error received from peer ipv4:127.0.0.1:42999","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576411796.764032739","description":"Error received from peer ipv4:127.0.0.1:43795","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576411796.764056123","description":"Error received from peer ipv4:127.0.0.1:37191","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3349.460s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_48-9412082939406391837?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_09_08-17927721705181114564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_16_37-15706564244127310441?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_23_33-12463447842325441575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_30_05-5475146711168636614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_37_01-14899734519365126611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_44_13-10615271214537249574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_50_29-4697404834717944491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_53-10593685174992454541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_16_14-18078059545056829133?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_24_33-4697610121665058612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_32_08-4874680422652082498?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_38_55-788945029206579270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_48-4793862878148131495?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_19_37-11872291532740415156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_37_02-9607475317053302859?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_52-5134765865642485891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_14_08-17202852403905175250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_20_46-9991277387400409428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_27_34-12754515572193542301?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_33_51-5177329545386325212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_49-16288860148334144737?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_18_14-17025367547834102338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_26_10-16653743490615392695?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_33_09-8409117812907229534?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_47-17437912902138661161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_08_37-15941310966022477694?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_17_11-18126657295592546840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_23_45-12829468455750965513?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_30_09-10725131619104881716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_50-3474306805766727463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_10_42-11750929727572647566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_21_04-8188697324661097001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_27_47-4614057992483751453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_34_27-4151139599194454238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_01_50-15301077021045134206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_09_58-3837675489763952948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_17_53-10544227649022605866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_25_09-3675480032418106998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-15_04_32_03-5938639250118441245?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 59s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ibxe7nyyijstu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1227

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1227/display/redirect>

Changes:


------------------------------------------
[...truncated 1.48 MB...]
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44487
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 54603 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_15}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=722, count=34, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, 
count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=673, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=674, count=35, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=835, count=35, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=583, count=35, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=627, count=17, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=755, count=41, min=16, max=22}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
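The user-scoped entries in the metrics dump above (NAMESPACE=__main__.WordExtractingDoFn with NAME=words, word_lengths, empty_lines and word_len_dist) come from counters and a distribution declared inside the example's DoFn. As an illustrative sketch only, not the exact job source, a DoFn producing metrics like these would look roughly as follows:

    # Sketch: user counters/distributions of the kind reported in the dump above.
    import re

    import apache_beam as beam
    from apache_beam.metrics import Metrics


    class WordExtractingDoFn(beam.DoFn):
        """Splits lines into words and records user-defined metrics."""

        def __init__(self):
            super(WordExtractingDoFn, self).__init__()
            self.words_counter = Metrics.counter(self.__class__, 'words')
            self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
            self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')
            self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

        def process(self, element):
            if not element.strip():
                self.empty_line_counter.inc(1)
            for word in re.findall(r"[\w']+", element, re.UNICODE):
                self.words_counter.inc()
                self.word_lengths_counter.inc(len(word))
                self.word_lengths_dist.update(len(word))
                yield word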
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempLweQUH/artifactszDmuJT/job_b75a8973-97c9-404d-ba27-a5accc5bc067/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempLweQUH/artifactszDmuJT/job_b75a8973-97c9-404d-ba27-a5accc5bc067/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1215061320-f72d8780_8f6e1448-7f39-4b98-9492-3ded204fd48c
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1215061320-f72d8780_8f6e1448-7f39-4b98-9492-3ded204fd48c
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
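The two INFO lines above are presumably printed by the example after the run finishes, by querying its user metrics back from the PipelineResult. A minimal sketch of that query (assuming a `pipeline` object built elsewhere; not the job's actual source):

    # Sketch: reading user metrics back from a finished pipeline.
    import logging

    from apache_beam.metrics.metric import MetricsFilter

    result = pipeline.run()   # `pipeline` is assumed to exist
    result.wait_until_finish()

    empty_lines = result.metrics().query(MetricsFilter().with_name('empty_lines'))
    if empty_lines['counters']:
        logging.info('number of empty lines: %d', empty_lines['counters'][0].committed)

    word_lengths = result.metrics().query(MetricsFilter().with_name('word_len_dist'))
    if word_lengths['distributions']:
        logging.info('average word length: %d', word_lengths['distributions'][0].committed.mean)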

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
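The "No handlers could be found for logger" message is Python 2's logging module complaining that a module logged before any handler was configured. A minimal sketch of the usual fix in the driver script (illustrative only):

    # Sketch: attach a default handler so Python 2 logging has somewhere to write.
    import logging

    logging.basicConfig(level=logging.INFO)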
19/12/15 06:14:24 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:34349
19/12/15 06:14:24 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34905
19/12/15 06:14:24 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:34411
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/15 06:14:25 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f
19/12/15 06:14:25 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
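The LOOPBACK line above refers to the portable runner's loopback environment, where the submitting Python process itself hosts the SDK worker. A sketch of pipeline options that select it; the job_endpoint port is taken from the JobService line earlier in this log and would differ on another run:

    # Illustrative portable-runner options only, not the test harness's exact flags.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:34411',
        '--environment_type=LOOPBACK',
    ])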
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/15 06:14:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/15 06:14:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/15 06:14:26 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/15 06:14:26 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/15 06:14:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f on Spark master local[4]
19/12/15 06:14:27 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/15 06:14:27 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42275.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/15 06:14:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:34839.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:38839
19/12/15 06:14:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/15 06:14:30 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/15 06:14:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/15 06:15:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f finished.
19/12/15 06:15:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/15 06:15:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempHbHH2t/artifacts4ClRmb/job_99553db6-ddf8-499d-9ff7-8a38d524631f/MANIFEST has 1 artifact locations
19/12/15 06:15:23 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempHbHH2t/artifacts4ClRmb/job_99553db6-ddf8-499d-9ff7-8a38d524631f/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/15 06:15:23 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f
19/12/15 06:15:23 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1215061425-f0438a23_cd4f9d78-09e0-4700-824e-78fc36733e8f
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576390524.176377870","description":"Error received from peer ipv4:127.0.0.1:34839","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576390524.176414538","description":"Error received from peer ipv4:127.0.0.1:42275","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576390524.176401925","description":"Error received from peer ipv4:127.0.0.1:38839","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576390524.176401925","description":"Error received from peer ipv4:127.0.0.1:38839","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
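The three _Rendezvous tracebacks above come from background reader threads (state, control and data channels) whose streaming RPCs were still open when the job server shut down; gRPC surfaces that as StatusCode.UNAVAILABLE with "Socket closed". A sketch of how such a shutdown-time error could be tolerated in a reader loop (handle() is a hypothetical callback, not a Beam API):

    # Sketch only: swallow the expected "server went away" error on a response stream.
    import grpc

    def drain(response_stream):
        try:
            for response in response_stream:
                handle(response)  # hypothetical handler
        except grpc.RpcError as err:
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise  # anything other than a shutdown-time disconnect is unexpected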


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3405.028s

OK (SKIP=4)
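The BeamDeprecationWarning lines earlier in this task's output ("options is deprecated since First stable release") refer to reading options back off the Pipeline object. The direction they point toward is to keep a handle on the PipelineOptions you constructed and call view_as on that instead; a sketch, with an illustrative bucket name:

    # Sketch of the non-deprecated pattern: use your own PipelineOptions object,
    # not <pipeline>.options.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])  # bucket is illustrative
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([temp_location])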

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 141

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:crossLanguagePortableWordCount'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 8s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/jubd5ohd4mp7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1226

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1226/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-c150c53e-b081-4aaf-9158-c47143783985
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:34709
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15799 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=744, count=35, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=704, count=35, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=615, count=32, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=732, count=31, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=579, count=35, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=758, count=21, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=683, count=37, min=16, max=22}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp9zB8PY/artifactsLAOX3Z/job_fd6e21bb-e3c3-4181-8889-ae7b47ba58e5/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp9zB8PY/artifactsLAOX3Z/job_fd6e21bb-e3c3-4181-8889-ae7b47ba58e5/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1215000905-de05b87d_b268d5ea-5253-4540-98fc-681675085c2a
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1215000905-de05b87d_b268d5ea-5253-4540-98fc-681675085c2a
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/15 00:09:28 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:43593
19/12/15 00:09:28 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:37461
19/12/15 00:09:28 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:39617
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/15 00:09:30 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0
19/12/15 00:09:30 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/15 00:09:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/15 00:09:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/15 00:09:31 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/15 00:09:31 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/15 00:09:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0 on Spark master local[4]
19/12/15 00:09:32 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/15 00:09:32 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46785.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/15 00:09:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:34463.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36467
19/12/15 00:09:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/15 00:09:35 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/15 00:09:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/15 00:09:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0 finished.
19/12/15 00:09:50 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/15 00:09:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempj34SJN/artifacts5vsFPe/job_f1dbf21b-7f62-41e3-8ce9-d86e2528234d/MANIFEST has 1 artifact locations
19/12/15 00:09:50 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempj34SJN/artifacts5vsFPe/job_f1dbf21b-7f62-41e3-8ce9-d86e2528234d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/15 00:09:50 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0
19/12/15 00:09:50 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1215000930-b86994e6_ee04c738-ef25-4da7-9239-f24cc71104f0
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576368591.224213655","description":"Error received from peer ipv4:127.0.0.1:46785","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576368591.224211551","description":"Error received from peer ipv4:127.0.0.1:34463","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576368591.224157209","description":"Error received from peer ipv4:127.0.0.1:36467","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576368591.224157209","description":"Error received from peer ipv4:127.0.0.1:36467","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3214.343s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 40s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/qqjpx2seksohe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1225

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1225/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-4f2d4be9-3ffb-4fdc-a2f8-2d4c3b46e194
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-908b67d6-8173-463d-a615-ed271d8689eb
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37515
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 17157 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=854, count=40, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=718, count=35, min=18, max=27}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=650, count=34, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=750, count=32, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=572, count=35, min=14, max=20}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=781, count=22, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=744, count=40, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp80fFGi/artifactswD9y80/job_8bfeb72e-aeec-4c85-9a2c-137feb189a61/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp80fFGi/artifactswD9y80/job_8bfeb72e-aeec-4c85-9a2c-137feb189a61/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1214180852-c63d9cc1_1ac83385-3d0a-455c-8363-78479838052b
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1214180852-c63d9cc1_1ac83385-3d0a-455c-8363-78479838052b
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
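
The empty-line count and average word length just logged, and the user metrics in the MetricQueryResults dump above (words=96, empty_lines=2, word_lengths=298, word_len_dist under namespace __main__.WordExtractingDoFn; 298/96 is roughly 3), would typically come from a DoFn along the lines of the sketch below. It is modeled on the Beam wordcount example and is not copied from the exact pipeline under test.

import re

import apache_beam as beam
from apache_beam.metrics import Metrics


class WordExtractingDoFn(beam.DoFn):
    """Splits lines into words and reports user metrics like those logged above."""

    def __init__(self):
        super(WordExtractingDoFn, self).__init__()
        self.words_counter = Metrics.counter(self.__class__, 'words')
        self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
        self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')
        self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

    def process(self, element):
        if not element.strip():
            self.empty_line_counter.inc(1)
        words = re.findall(r"[\w']+", element)
        for w in words:
            self.words_counter.inc()
            self.word_lengths_counter.inc(len(w))
            self.word_lengths_dist.update(len(w))
        return words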

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
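
The "No handlers could be found" line is the standard Python 2 logging complaint when a module logs before any handler is configured; when reproducing this locally it can be silenced with a minimal setup like the following (not part of the test harness itself):

import logging

# Configure a root handler before pipeline construction so apache_beam's
# loggers have somewhere to write.
logging.basicConfig(level=logging.INFO)
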
19/12/14 18:09:18 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:36647
19/12/14 18:09:18 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:39615
19/12/14 18:09:18 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:46013
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/14 18:09:19 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0
19/12/14 18:09:19 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/14 18:09:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/14 18:09:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/14 18:09:20 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/14 18:09:20 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/14 18:09:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0 on Spark master local[4]
19/12/14 18:09:21 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/14 18:09:21 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42381.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/14 18:09:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33799.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:41129
19/12/14 18:09:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/14 18:09:24 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/14 18:09:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/14 18:09:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0 finished.
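
For context on the run above (LOOPBACK environment, JobService on localhost:46013, four output shards), a wordcount pipeline of this shape is typically submitted roughly as sketched below. The endpoint, input path, and output prefix are placeholders, and this is not the exact test-harness invocation.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:46013',   # JobService address from the log above
    '--environment_type=LOOPBACK',      # run the SDK harness in-process
])

with beam.Pipeline(options=options) as p:
    (p
     | 'read' >> beam.io.ReadFromText('/tmp/input.txt')
     | 'split' >> beam.FlatMap(lambda line: line.split())
     | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
     | 'group' >> beam.GroupByKey()
     | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
     | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
     | 'write' >> beam.io.WriteToText('/tmp/py-wordcount', num_shards=4))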

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
19/12/14 18:09:41 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/14 18:09:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempuUxv28/artifactsFfh7eb/job_2e75e4ec-caa1-4518-9b73-64771a46630a/MANIFEST has 1 artifact locations
19/12/14 18:09:41 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempuUxv28/artifactsFfh7eb/job_2e75e4ec-caa1-4518-9b73-64771a46630a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/14 18:09:41 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0
19/12/14 18:09:41 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1214180919-ebbe8eaa_bb250935-a484-4f47-a7fb-4fd6e400a2c0
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576346981.787431110","description":"Error received from peer ipv4:127.0.0.1:41129","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576346981.787431110","description":"Error received from peer ipv4:127.0.0.1:41129","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576346981.787355208","description":"Error received from peer ipv4:127.0.0.1:33799","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576346981.787504727","description":"Error received from peer ipv4:127.0.0.1:42381","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
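
The three _Rendezvous tracebacks above come from background reader threads in the SDK worker (the data channel on port 41129, the state channel on 33799, and the control channel on 42381) that were still iterating their gRPC streams when the job server shut down after the job reached DONE, so each stream terminated with StatusCode.UNAVAILABLE / "Socket closed". A minimal sketch of how such a streaming read fails and could be tolerated at shutdown is below; handle is a hypothetical callback, and this is not the Beam worker's actual code.

import grpc

def read_stream(response_iterator, handle):
    """Drain a server-streaming RPC, tolerating a shutdown-time disconnect."""
    try:
        for response in response_iterator:
            handle(response)
    except grpc.RpcError as err:
        if err.code() == grpc.StatusCode.UNAVAILABLE:
            # "Socket closed": the peer went away (e.g. the job server
            # shutting down after the job finished); treat as end of stream.
            return
        raise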


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3252.288s

OK (SKIP=4)
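
Several of the BeamDeprecationWarnings earlier in this suite (dataflow_runner.py:738 and big_query_query_to_table_pipeline.py:73) note that BigQuerySink is deprecated since 2.11.0 in favour of WriteToBigQuery. A hedged sketch of the suggested replacement, with placeholder project, dataset, table, and schema:

import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | beam.Create([{'word': 'beam', 'count': 1}])          # placeholder rows
     | beam.io.WriteToBigQuery(
         'my_project:my_dataset.word_counts',               # placeholder table spec
         schema='word:STRING,count:INTEGER',
         create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))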

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 13s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/l5unesw2dff5q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1224

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1224/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-65996922-b67c-4327-9ee6-7d2dc2393481
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:36633
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15542 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=727, count=34, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=775, count=38, min=18, max=27}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=694, count=36, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=876, count=37, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=584, count=35, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=673, count=17, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=659, count=35, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
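Buried in the dump above are the user metrics reported under the __main__.WordExtractingDoFn namespace (words, word_lengths, empty_lines and the word_len_dist distribution). As a point of reference, the following is a minimal sketch of how such counters and distributions are typically declared in a Python DoFn; it is modeled on the SDK's wordcount example, not copied from this job.

    import re

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class WordExtractingDoFn(beam.DoFn):
        """Splits lines into words while reporting user metrics."""

        def __init__(self):
            super(WordExtractingDoFn, self).__init__()
            # Namespace defaults to the class, giving __main__.WordExtractingDoFn.
            self.words_counter = Metrics.counter(self.__class__, 'words')
            self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
            self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')
            self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

        def process(self, element):
            text_line = element.strip()
            if not text_line:
                self.empty_line_counter.inc()
            for word in re.findall(r"[\w']+", text_line):
                self.words_counter.inc()
                self.word_lengths_counter.inc(len(word))
                self.word_lengths_dist.update(len(word))
                yield word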
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp5ujKZ4/artifactsW17bfV/job_18e772c0-2f20-4084-b90a-41c44e10114b/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp5ujKZ4/artifactsW17bfV/job_18e772c0-2f20-4084-b90a-41c44e10114b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1214120917-3ad03ff7_9ed38585-b3ad-44e4-9299-f7bc4a87846e
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1214120917-3ad03ff7_9ed38585-b3ad-44e4-9299-f7bc4a87846e
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
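The two INFO:root lines above are the values of those same user metrics, read back from the finished job. A minimal, self-contained sketch of that query pattern (toy input and the 'example' namespace are placeholders, not taken from this build):

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.metrics.metric import MetricsFilter

    def count_empty(line, counter=Metrics.counter('example', 'empty_lines')):
        if not line.strip():
            counter.inc()
        return line

    p = beam.Pipeline()
    _ = p | beam.Create(['a line', '', 'another line']) | beam.Map(count_empty)
    result = p.run()
    result.wait_until_finish()

    # Query the job-level metrics by name once the pipeline has finished.
    query = result.metrics().query(MetricsFilter().with_name('empty_lines'))
    for counter in query['counters']:
        # .attempted can be used where a runner does not report committed values.
        print('number of empty lines: %s' % counter.committed)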

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/14 12:09:41 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:41995
19/12/14 12:09:41 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:35131
19/12/14 12:09:41 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:50107
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/14 12:09:43 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973
19/12/14 12:09:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
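For context, "LOOPBACK" means the SDK harness runs inside the submitting Python process and the job server connects back to it, which is why the client has to stay alive until the job finishes. A minimal sketch of submitting against such a portable JobService follows; the endpoint is simply the port the JobService reported above, and the toy transforms are placeholders.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:50107',   # illustrative; use whatever the job server prints
        '--environment_type=LOOPBACK',
    ])

    # The pipeline runs (and blocks) when the context manager exits.
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['a line', '', 'another line'])
         | beam.Map(len))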
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/14 12:09:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/14 12:09:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/14 12:09:43 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/14 12:09:44 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/14 12:09:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973 on Spark master local[4]
19/12/14 12:09:44 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/14 12:09:45 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:38479.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/14 12:09:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39761.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:46497
19/12/14 12:09:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/14 12:09:47 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/14 12:09:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
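The filebasedsink messages above come from a sharded text sink: it deletes any existing files matching the -*-of-%(num_shards)05d pattern, writes the bundles, then renames the temporary shards during finalize_write. A sketch of a write that produces this behaviour (the output prefix is a placeholder):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create(['one', 'two', 'three'])
         # num_shards=4 yields files named <prefix>-0000N-of-00004, which is the
         # pattern the sink cleans up before rewriting.
         | beam.io.WriteToText('/tmp/counts', num_shards=4))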
19/12/14 12:10:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973 finished.
19/12/14 12:10:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/14 12:10:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temprhBocf/artifactsiGZMK_/job_86d25b02-4946-4476-ab21-46e6f56373ea/MANIFEST has 1 artifact locations
19/12/14 12:10:03 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temprhBocf/artifactsiGZMK_/job_86d25b02-4946-4476-ab21-46e6f56373ea/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/14 12:10:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973
19/12/14 12:10:03 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1214120943-66e186e6_b00c09b0-0693-4e8c-84b3-8e07d43a2973
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576325403.642840670","description":"Error received from peer ipv4:127.0.0.1:38479","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576325403.642774945","description":"Error received from peer ipv4:127.0.0.1:46497","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576325403.642774945","description":"Error received from peer ipv4:127.0.0.1:46497","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576325403.642824681","description":"Error received from peer ipv4:127.0.0.1:39761","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
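The three tracebacks above come from the SDK worker's background reader threads for the control, data and state channels, and they appear only after the job has already reached DONE, which suggests shutdown noise rather than the task failure itself (the Gradle summary further down attributes the failure to hdfsIntegrationTest). Illustratively, and not the SDK's actual handling, a long-lived streaming read can treat UNAVAILABLE during teardown as benign:

    import grpc

    def drain(response_iterator, handle, shutting_down):
        # handle: callable invoked per message; shutting_down: callable returning
        # True once local teardown has begun (both names are hypothetical).
        try:
            for response in response_iterator:
                handle(response)
        except grpc.RpcError as e:
            # "Socket closed" surfaces as StatusCode.UNAVAILABLE when the server
            # side of the channel goes away first; during shutdown that is expected.
            if shutting_down() and e.code() == grpc.StatusCode.UNAVAILABLE:
                return
            raise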


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
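The BigQuerySink deprecation warning above points at the transform it recommends instead; a minimal sketch of WriteToBigQuery (the table spec and schema are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.word_counts',        # placeholder table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))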
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
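The repeated "options is deprecated" warnings refer to reading options back off the pipeline object (p.options / pcoll.pipeline.options). The pattern the warning suggests is to keep one PipelineOptions instance and view it as the sub-options you need, along these lines:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        PipelineOptions, GoogleCloudOptions, StandardOptions)

    options = PipelineOptions()   # parsed once, up front
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    streaming = options.view_as(StandardOptions).streaming

    with beam.Pipeline(options=options) as p:
        pass  # build the pipeline using the same 'options' object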
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3239.958s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 8s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/7uvwhbf26ppk4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1223

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1223/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-c7c035cf-0b04-4269-a02d-58b332bf4904
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:36461
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 57230 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 
6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=760, count=36, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=767, count=38, min=18, max=27}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=655, count=34, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=817, count=35, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=569, count=34, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=737, count=19, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=702, count=38, min=16, max=25}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temppsAd4o/artifactsfi_Elc/job_d3e033d7-cc17-4cad-8286-9d843fb18e7e/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temppsAd4o/artifactsfi_Elc/job_d3e033d7-cc17-4cad-8286-9d843fb18e7e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1214061007-643dd379_5c82ffed-3021-4d10-b419-a0f5c5d24a49
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1214061007-643dd379_5c82ffed-3021-4d10-b419-a0f5c5d24a49
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/14 06:11:13 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:38815
19/12/14 06:11:13 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:42709
19/12/14 06:11:13 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:55625
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/14 06:11:15 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a
19/12/14 06:11:15 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/14 06:11:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/14 06:11:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/14 06:11:15 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/14 06:11:16 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/14 06:11:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a on Spark master local[4]
19/12/14 06:11:16 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/14 06:11:16 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:43315.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/14 06:11:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:36849.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36835
19/12/14 06:11:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/14 06:11:19 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/14 06:11:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/14 06:12:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a finished.
19/12/14 06:12:12 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/14 06:12:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempq_az_R/artifactsHHNFPx/job_d019ec25-e05d-490a-a8e0-ad9952a9c27d/MANIFEST has 1 artifact locations
19/12/14 06:12:12 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempq_az_R/artifactsHHNFPx/job_d019ec25-e05d-490a-a8e0-ad9952a9c27d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/14 06:12:12 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a
19/12/14 06:12:12 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1214061115-66b73e89_ad5061c7-9e7c-4fff-aeab-4c3e70e8893a
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576303933.239072094","description":"Error received from peer ipv4:127.0.0.1:43315","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576303933.239048773","description":"Error received from peer ipv4:127.0.0.1:36849","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576303933.239003118","description":"Error received from peer ipv4:127.0.0.1:36835","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576303933.239003118","description":"Error received from peer ipv4:127.0.0.1:36835","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3446.235s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 1s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/tl4e7dchnkhz6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1222

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1222/display/redirect?page=changes>

Changes:

[robertwb] Fixed instructions and the example of running Interactive Beam on Flink.


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-4cfbe3ca-ca16-461d-89f5-37289746d7fb
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:38379
[flink-akka.actor.default-dispatcher-10] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 37754 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 
1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=782, count=37, min=19, max=25}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=715, count=35, min=18, max=27}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=715, count=37, min=17, max=26}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=948, count=40, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=569, count=34, min=14, max=23}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=740, count=21, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=638, count=34, min=16, max=25}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempwHRKn1/artifactsdHrhVB/job_da7c0957-a572-4b84-8ef1-c3be6d90af27/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempwHRKn1/artifactsdHrhVB/job_da7c0957-a572-4b84-8ef1-c3be6d90af27/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1214014612-2d846502_0dd5a69d-bf08-4cb9-bb2d-a9a92fdd10bc
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1214014612-2d846502_0dd5a69d-bf08-4cb9-bb2d-a9a92fdd10bc
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/14 01:46:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:32797
19/12/14 01:46:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:41227
19/12/14 01:46:59 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:48707
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/14 01:47:00 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd
19/12/14 01:47:00 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/14 01:47:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/14 01:47:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/14 01:47:00 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/14 01:47:01 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/14 01:47:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd on Spark master local[4]
19/12/14 01:47:02 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/14 01:47:02 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40077.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/14 01:47:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40075.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:44185
19/12/14 01:47:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/14 01:47:05 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/14 01:47:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/14 01:47:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd finished.
19/12/14 01:47:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/14 01:47:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp8Lg45s/artifactstQGz3F/job_daf10139-800a-453d-8d38-1c4844c0b76b/MANIFEST has 1 artifact locations
19/12/14 01:47:47 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp8Lg45s/artifactstQGz3F/job_daf10139-800a-453d-8d38-1c4844c0b76b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/14 01:47:47 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd
19/12/14 01:47:47 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1214014700-b36fef5c_a955a62d-99d1-4af3-88ae-c71bf96d72dd
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576288067.845271151","description":"Error received from peer ipv4:127.0.0.1:40075","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576288067.843968201","description":"Error received from peer ipv4:127.0.0.1:44185","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576288067.843968201","description":"Error received from peer ipv4:127.0.0.1:44185","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576288067.845345914","description":"Error received from peer ipv4:127.0.0.1:40077","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3223.770s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
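For context, the Gradle output above already names the failing task and the flags for getting more
detail. A minimal sketch of such a re-run (assuming a local checkout of the Beam repository and its
standard ./gradlew wrapper at the repo root; this exact invocation is an assumption, not part of
this log):

    ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --stacktrace --info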

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 6s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/xsyzdechqjuey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1221

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1221/display/redirect>

Changes:


------------------------------------------
[...truncated 1.42 MB...]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-e84fcf5d-1f5b-4b5c-b540-d23b3c676c3e
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:43943
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15908 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 44, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_10:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_31}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=words}: 96, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_13}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_8}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2573>)_20}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_7}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_lengths}: 298, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_13:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 27, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 96, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_25}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 44, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=790, count=37, min=19, max=28}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=271, count=1, min=271, max=271}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=945, count=1, min=945, max=945}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=672, count=33, min=18, max=24}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=727, count=38, min=17, max=23}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=278, count=2, min=139, max=139}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=822, count=35, min=20, max=29}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=276, count=2, min=138, max=138}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=81, count=1, min=81, max=81}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=525, count=31, min=14, max=23}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=783, count=20, min=14, max=84}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=1112, count=1, min=1112, max=1112}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=677, count=36, min=16, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=13, count=1, min=13, max=13}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=14, count=1, min=14, max=14}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=14, count=1, min=14, max=14}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=276, count=2, min=138, max=138}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_7, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=106, count=2, min=53, max=53}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=15, count=1, min=15, max=15}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=945, count=1, min=945, max=945}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempcFsqyg/artifactsUEDQUs/job_85ae050a-180c-498f-b7f6-7cb29005f5f7/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempcFsqyg/artifactsUEDQUs/job_85ae050a-180c-498f-b7f6-7cb29005f5f7/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1214001103-f18d2ddc_6db74010-b13a-4b6e-b337-b9d0934658d3
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1214001103-f18d2ddc_6db74010-b13a-4b6e-b337-b9d0934658d3
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
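The empty-line and word-length numbers above, together with the user metrics in the accumulator dump (words: 96, word_lengths: 298, empty_lines: 2, and the word_len_dist distribution in the __main__.WordExtractingDoFn namespace), are produced by the wordcount example's DoFn. As a minimal sketch of where such counters come from, assuming the standard Beam Python metrics API and reusing only the metric names that appear in the log (the DoFn body itself is illustrative):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class WordExtractingDoFn(beam.DoFn):
        def __init__(self):
            # Namespace __main__.WordExtractingDoFn and these metric names match the log above.
            self.words_counter = Metrics.counter(self.__class__, 'words')
            self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
            self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')
            self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

        def process(self, element):
            # Count empty lines, then emit each word while updating the counters.
            if not element.strip():
                self.empty_line_counter.inc(1)
            for word in element.split():
                self.words_counter.inc()
                self.word_lengths_counter.inc(len(word))
                self.word_lengths_dist.update(len(word))
                yield word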

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/14 00:11:27 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44335
19/12/14 00:11:27 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34539
19/12/14 00:11:27 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:42965
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/14 00:11:29 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f
19/12/14 00:11:29 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
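The LOOPBACK message means the SDK worker components run inside the submitting Python process rather than in a separate container, which is why the script blocks until the job finishes. A rough sketch of submitting a pipeline to the job server started above with that environment (assumed, not the Jenkins invocation; only the job endpoint port is taken from the "JobService started" line in this log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:42965',   # port from the JobService line above
        '--environment_type=LOOPBACK',
    ])
    # Leaving the block waits for the job, matching the "Waiting until the
    # pipeline has finished" message in the log.
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['illustrative', 'input']) | beam.Map(len)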
19/12/14 00:11:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/14 00:11:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/14 00:11:29 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/14 00:11:30 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/14 00:11:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f on Spark master local[4]
19/12/14 00:11:30 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/14 00:11:30 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37511.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/14 00:11:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:35645.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36299
19/12/14 00:11:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/14 00:11:33 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/14 00:11:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/14 00:11:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f finished.
19/12/14 00:11:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/14 00:11:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp04Li_l/artifactshTZh5R/job_3df2d6b3-e5f5-429c-8ad4-f308f0e944e2/MANIFEST has 1 artifact locations
19/12/14 00:11:49 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp04Li_l/artifactshTZh5R/job_3df2d6b3-e5f5-429c-8ad4-f308f0e944e2/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/14 00:11:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f
19/12/14 00:11:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1214001129-3669e972_17a4756f-e7ba-4e8b-b114-d62d6255e42f
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576282309.894680355","description":"Error received from peer ipv4:127.0.0.1:37511","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576282309.894510047","description":"Error received from peer ipv4:127.0.0.1:36299","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576282309.894510047","description":"Error received from peer ipv4:127.0.0.1:36299","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576282309.894538361","description":"Error received from peer ipv4:127.0.0.1:35645","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
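The three tracebacks above come from SDK worker background threads (control, data, and state channels) whose gRPC streams are severed when the job server closes the connections after the job has already reached DONE; gRPC surfaces that as StatusCode.UNAVAILABLE with "Socket closed". A minimal sketch of the pattern involved, assuming only the public grpc API (the handler and the done-check are hypothetical placeholders, not SDK code):

    import grpc

    def read_stream(response_iterator, handle, job_is_done):
        # Drain a gRPC response stream; treat teardown after DONE as benign.
        try:
            for response in response_iterator:
                handle(response)
        except grpc.RpcError as e:
            if job_is_done() and e.code() == grpc.StatusCode.UNAVAILABLE:
                return  # channel closed during shutdown, e.g. "Socket closed"
            raise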


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
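The BeamDeprecationWarning above points at WriteToBigQuery as the replacement for BigQuerySink. A minimal sketch of that usage, with an illustrative table and schema rather than anything taken from these tests:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
               'my-project:my_dataset.word_counts',
               schema='word:STRING, count:INTEGER',
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))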
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3315.928s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
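The failing task is the direct-runner HDFS integration test, which runs through 'sh', so the exit-value message above carries no detail about the underlying error. A plausible way to dig further locally (assumed, not taken from this log) is to re-run just that task from the Beam source root with the extra output Gradle suggests, for example: ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --stacktrace --info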

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 23s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/sdhlkshwzeatm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1220

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1220/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-3713] pytest migration: py3x-{gcp,cython}


------------------------------------------
[...truncated 1.48 MB...]
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/13 21:01:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/13 21:01:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/13 21:01:10 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/13 21:01:10 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/13 21:01:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1213210109-9277a7d7_6158c1d6-d4ab-4f6f-9249-580cc3c6f0e8 on Spark master local[4]
19/12/13 21:01:11 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/13 21:01:11 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:35229.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/13 21:01:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:42563.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:33893
19/12/13 21:01:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/13 21:01:14 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/13 21:01:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213210109-9277a7d7_6158c1d6-d4ab-4f6f-9249-580cc3c6f0e8: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/13 21:02:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213210109-9277a7d7_6158c1d6-d4ab-4f6f-9249-580cc3c6f0e8 finished.
19/12/13 21:02:04 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/13 21:02:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempPJup0y/artifactsZF30NC/job_9bd745d1-dcf4-42f7-a773-ec46e75c515c/MANIFEST has 1 artifact locations
19/12/13 21:02:04 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempPJup0y/artifactsZF30NC/job_9bd745d1-dcf4-42f7-a773-ec46e75c515c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/13 21:02:04 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1213210109-9277a7d7_6158c1d6-d4ab-4f6f-9249-580cc3c6f0e8
19/12/13 21:02:04 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1213210109-9277a7d7_6158c1d6-d4ab-4f6f-9249-580cc3c6f0e8
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576270925.163579519","description":"Error received from peer ipv4:127.0.0.1:35229","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576270925.163514156","description":"Error received from peer ipv4:127.0.0.1:33893","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576270925.163514156","description":"Error received from peer ipv4:127.0.0.1:33893","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576270925.163548901","description":"Error received from peer ipv4:127.0.0.1:42563","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
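
The three _Rendezvous tracebacks above appear after the job has already reached DONE: the job server tears down its control, state and data channels, and the worker threads still blocked on those gRPC streams see StatusCode.UNAVAILABLE with "Socket closed". A minimal sketch, not the SDK harness code itself, of a reader loop that treats that status as a normal shutdown:

    import grpc

    def drain(response_stream, handle):
        # Read a gRPC response stream until it ends; a channel torn down at
        # shutdown surfaces as UNAVAILABLE ("Socket closed") and is ignored.
        try:
            for response in response_stream:
                handle(response)  # caller-supplied per-message callback
        except grpc.RpcError as err:
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise  # anything other than a closed channel is a real error
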


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
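
The BeamDeprecationWarning above (repeated several times in this run) points at WriteToBigQuery as the replacement for the deprecated BigQuerySink. A hedged sketch of that transform; the project, dataset, table and schema below are placeholders, not the tables these integration tests write to:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])  # placeholder rows
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',          # hypothetical destination
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
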
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
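
The FutureWarnings above note that the fileio match/read transforms used by these tests are still experimental. A rough sketch of the same pattern; MatchFiles is used here instead of the test's MatchAll for brevity, and the glob and checksum function are placeholders:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('/tmp/example/*.txt')                  # hypothetical glob
         | fileio.ReadMatches()                                      # emits ReadableFile objects
         | 'Checksums' >> beam.Map(lambda f: hash(f.read_utf8())))  # placeholder checksum
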
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3299.631s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_28-1731300365448934694?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_59_43-14113084375125099717?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_06_56-10916315208703637315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_13_30-16083420514809444201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_19_50-9713679333524223480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_26_20-13149766220479684216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_33_33-10707565039288380057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_40_29-8743890124943544226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_31-4730838405119156578?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_07_26-3851330925423606162?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_13_32-3151460191932164731?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_20_07-3288994359118813808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_27_04-13742073479639505488?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_33-1323034124004927723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_09_33-16468003207218644879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_16_47-9165002487596585541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_23_50-2460518847909363792?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_31-4514552958523185491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_04_41-12878665890114274615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_11_07-12631143005048704953?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_17_34-14479893698444373198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_24_10-10944439438490492496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_27-2751788522790104182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_11_33-10540941371214718867?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_17_55-7862209328994358639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_24_45-3501292795098175419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_28-7660443407828548232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_59_55-4539106509382211054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_06_40-12477933695929235236?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_14_47-10827371658824355801?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_22_17-18165927578037516549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_28_27-8146364567594991388?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_25-3365284224482275356?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_59_10-9613193383807932728?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_07_55-3386737867369398436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_15_15-14829399040909781157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_22_04-4756935216742344042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_52_26-15626280620009590264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_01_54-16229051660716484738?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_13_11_12-11622437702962379337?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 10s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/j7umlpnraforu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1219

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1219/display/redirect?page=changes>

Changes:

[younghoono] [BEAM-7970] Improved error help in Go PROTOBUF.md


------------------------------------------
[...truncated 1.47 MB...]
19/12/13 19:39:09 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:41085
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
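
The warning above means the Python option parser saw flags it has no declared arguments for (--parallelism and --shutdown_sources_on_final_watermark are runner-side settings) and dropped them. A hedged sketch of the usual way to declare such a flag so it is parsed rather than discarded; the subclass name is made up, and only --parallelism is reused from the log:

    from apache_beam.options.pipeline_options import PipelineOptions

    class RunnerOptions(PipelineOptions):  # hypothetical options subclass
        @classmethod
        def _add_argparse_args(cls, parser):
            # Declaring the argument keeps it out of the "unparseable args" bucket.
            parser.add_argument('--parallelism', type=int, default=None)

    options = PipelineOptions(['--parallelism=2'])
    print(options.view_as(RunnerOptions).parallelism)  # prints 2
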
19/12/13 19:39:10 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae
19/12/13 19:39:10 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/13 19:39:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/13 19:39:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/13 19:39:10 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/13 19:39:11 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/13 19:39:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae on Spark master local[4]
19/12/13 19:39:12 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/13 19:39:12 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:33449.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/13 19:39:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:38703.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:38579
19/12/13 19:39:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/13 19:39:14 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/13 19:39:34 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/13 19:39:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae finished.
19/12/13 19:39:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/13 19:39:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temppBxZ3o/artifactsbq64za/job_064cca30-a1f0-4600-8aef-9b774e46379e/MANIFEST has 1 artifact locations
19/12/13 19:39:51 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temppBxZ3o/artifactsbq64za/job_064cca30-a1f0-4600-8aef-9b774e46379e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/13 19:39:51 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae
19/12/13 19:39:51 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1213193910-d283e7e_67581a7d-b53f-4f9c-8597-5da6be1e7fae
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576265992.115074579","description":"Error received from peer ipv4:127.0.0.1:38579","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576265992.115074579","description":"Error received from peer ipv4:127.0.0.1:38579","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576265992.115586012","description":"Error received from peer ipv4:127.0.0.1:33449","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576265992.115091571","description":"Error received from peer ipv4:127.0.0.1:38703","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3309.487s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_45-18273688262241444378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_37_47-1725584516325929232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_44_59-11381819719015222016?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_51_44-17497264918739831606?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_58_12-3833377491306313497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_04_59-1048086239050751323?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_11_51-15601797186179065251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_18_12-3656281767272984138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_50-18430800974843998463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_45_04-4646855181667069459?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_53_14-17942573625293878161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_00_17-2232581035017635074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_06_46-284764636375647491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_45-6917280649029485361?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_50_37-13596114389508673427?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_56_55-14189528090289118586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_03_49-972130096345667460?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_48-15827249020872005907?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_42_36-11905135976938997710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_48_58-7339410735393627875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_55_22-4302480287410972902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_02_28-5968617834990989249?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_45-6255306583912960433?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_46_28-14183688455008792371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_53_59-849952850932173420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_01_04-15685596227586802681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_44-5386537326389973423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_37_17-6561020396778031052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_45_58-17189955001813286032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_53_37-9241007557465631624?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_00_26-1877596656777786234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_44-4904149106917438850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_38_15-6006987881906588026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_45_37-10607229263166316602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_52_17-7709245866542775816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_58_39-371332596622686363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_30_44-5331475939883588500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_39_01-9402491622371508328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_48_25-3365182828468915546?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_12_05_39-9622590293529984817?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 9s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/47av3vjop2xqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1218

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1218/display/redirect>

Changes:


------------------------------------------
[...truncated 1.49 MB...]
19/12/13 18:19:33 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:53591
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/13 18:19:35 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995
19/12/13 18:19:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/13 18:19:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/13 18:19:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/13 18:19:35 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/13 18:19:36 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/13 18:19:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995 on Spark master local[4]
19/12/13 18:19:36 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/13 18:19:36 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:36805.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/13 18:19:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:44939.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:45823
19/12/13 18:19:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/13 18:19:39 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/13 18:19:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/13 18:19:55 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995 finished.
19/12/13 18:19:55 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/13 18:19:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempfJprHH/artifactsMgfDDc/job_5cb06f65-f67e-419e-9dc0-22935ca85450/MANIFEST has 1 artifact locations
19/12/13 18:19:55 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempfJprHH/artifactsMgfDDc/job_5cb06f65-f67e-419e-9dc0-22935ca85450/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/13 18:19:55 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995
19/12/13 18:19:55 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1213181935-b4262b2a_6b400dfb-1b9d-4d8c-9034-89e316d82995
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 126, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576261195.594529805","description":"Error received from peer ipv4:127.0.0.1:36805","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 609, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576261195.594479418","description":"Error received from peer ipv4:127.0.0.1:44939","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576261195.594431344","description":"Error received from peer ipv4:127.0.0.1:45823","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 333, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 318, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1576261195.594431344","description":"Error received from peer ipv4:127.0.0.1:45823","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


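A note on the repeated _Rendezvous failures above: the Python gRPC client raises StatusCode.UNAVAILABLE ("Socket closed") on its streaming calls when the peer closes the connection, which is what happens here as the portable Flink runner tears down its control and data channels at shutdown. A minimal, illustrative sketch of how such an error surfaces on the client side and can be told apart from other failures (the response iterator below is a stand-in, not Beam's actual worker code):

    import grpc

    def drain(response_iterator):
        # Iterating a streaming RPC raises grpc.RpcError when the channel fails;
        # the tracebacks above show the UNAVAILABLE / "Socket closed" case.
        try:
            for response in response_iterator:
                yield response
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                # The peer closed the socket (e.g. the job server shut down
                # first); an expected shutdown can be treated as end-of-stream.
                return
            raise
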
> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3372.966s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_47-16832899060951787331?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_19_18-5557707815302850788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_26_40-14936894831800103901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_34_04-3417099889410186199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_40_27-5212114052061281574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_48_00-482258338097652728?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_54_56-11025112592177257878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_11_01_14-7771021422971478872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_51-1318432053513083170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_26_13-5354533510000350026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_34_37-1473394838605467724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_42_06-8836296697257129171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_48_39-16116367108378503751?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_45-6524809324986953683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_28_57-13663303667804106233?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_35_20-10209209174927124492?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_42_41-446455395159687947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_50-13506078545424228861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_23_53-18439583200197344132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_30_18-1952485063679272480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_36_36-2334086099652807125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_43_05-9881680029599704934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_46-4762363533861043098?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_28_41-10329620618146309902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_35_58-6693577929669955659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_43_03-13716666906479303229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_45-8198253188797993218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_19_19-16734622158117693645?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_26_55-14185055086957897063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_34_11-14012500222901356276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_41_05-85802738017844002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_44-4215472079024605264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_18_31-11322508876341898523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_27_03-1371032462546896578?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_35_32-605684540263114303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_42_48-17583483430131331064?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_11_46-16104461928456279383?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_19_55-17655864761864653088?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_28_51-16945328327051261701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-13_10_45_47-5623290261222441340?project=apache-beam-testing
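
The BeamDeprecationWarning lines in the test output above point from the legacy BigQuerySink (deprecated since 2.11.0) to WriteToBigQuery. A rough sketch of the suggested replacement, using placeholder project, dataset, table and schema values rather than anything from this build:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create([{'name': 'example', 'score': 1}])
         # WriteToBigQuery is the replacement named in the warning; the table
         # reference and schema below are placeholders.
         | 'WriteToBQ' >> beam.io.WriteToBigQuery(
             table='my-project:my_dataset.my_table',
             schema='name:STRING,score:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

The other recurring warning, "options is deprecated since First stable release", is about reading settings back off the pipeline via <pipeline>.options. A small sketch of the pattern it suggests instead, keeping a reference to the options object that was used to construct the pipeline (bucket path is a placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
    # Read settings from the options you built, not from <pipeline>.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    p = beam.Pipeline(options=options)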

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
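
For the task that failed, the log only shows that the hdfsIntegrationTest shell step exited non-zero ("Process 'command 'sh'' finished with non-zero exit value 1"). When rerunning from a Beam checkout, the same task path can be invoked directly and combined with the flags suggested above to surface more of the underlying 'sh' output, for example:

    ./gradlew :sdks:python:test-suites:direct:py2:hdfsIntegrationTest --info --stacktrace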

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 16s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/4z5yqzh4bqlfo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org