Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/30 13:10:06 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #589

See <https://builds.apache.org/job/beam_PostCommit_Python2/589/display/redirect>

Changes:


------------------------------------------
[...truncated 770.01 KB...]
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) [DEPLOYING]
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd).
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 14bc917365f0a1777c2290a3dff733cd.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-0930121022-f43326e0 (d852a6fe1a153a98d7d679884aa8d5c7) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job d852a6fe1a153a98d7d679884aa8d5c7 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-0930121022-f43326e0(d852a6fe1a153a98d7d679884aa8d5c7).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 821e63a4670109eee7416d039dc6decd: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 98c4e4df1df4f0a995a1e2725ef747ac@akka://flink/user/jobmanager_1 for job d852a6fe1a153a98d7d679884aa8d5c7 from the resource manager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647}, allocationId: 740560020ed912e642893a01897cc0f1, jobId: d852a6fe1a153a98d7d679884aa8d5c7).
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job d852a6fe1a153a98d7d679884aa8d5c7 with leader id 98c4e4df1df4f0a995a1e2725ef747ac lost leadership.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647}, allocationId: a867034bf3d761bded23aad50b4c6587, jobId: d852a6fe1a153a98d7d679884aa8d5c7).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job d852a6fe1a153a98d7d679884aa8d5c7 from job leader monitoring.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job d852a6fe1a153a98d7d679884aa8d5c7.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job d852a6fe1a153a98d7d679884aa8d5c7.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job d852a6fe1a153a98d7d679884aa8d5c7 because it is not registered.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /tmp/flink-io-24802c82-f725-4caa-993c-817d3d8890e9
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-c40a046c-3ad6-4881-8899-950f552e213e
[ForkJoinPool.commonPool-worker-11] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:34767
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 13586 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 14, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 
ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 2, 
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, 
ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 
ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 2, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 3, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 14, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 0)Distributions(ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, max=21}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Manifest at /tmp/beam-artifact-staging/job_d7cc98fa-4f42-4fb0-875a-2c9d53dfbe96/MANIFEST has 0 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-artifact-staging/job_d7cc98fa-4f42-4fb0-875a-2c9d53dfbe96/
INFO:root:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

======================================================================
ERROR: test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/examples/complete/game/hourly_team_score_it_test.py",> line 89, in test_hourly_team_score_it
    self.test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/examples/complete/game/hourly_team_score.py",> line 303, in run
    }, options.view_as(GoogleCloudOptions).project))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 427, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 484, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 530, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 560, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 490, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py",> line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1
 
 Pip install failed for package: -r         
 Output from execution of subprocess: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  ERROR: Could not find a version that satisfies the requirement mock (from -r postcommit_requirements.txt (line 2)) (from versions: none)
ERROR: No matching distribution found for mock (from -r postcommit_requirements.txt (line 2))
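
[Editor's note] For context, the traceback above bottoms out in Beam's thin subprocess wrapper (apache_beam/utils/processes.py): the Dataflow stager shells out to pip to populate the requirements cache, and a non-zero exit is re-raised as the RuntimeError shown, with pip's output attached. The "(from versions: none)" result for mock under --no-binary :all: is likely a transient index or network issue, since mock does publish source distributions. A simplified sketch of the call pattern (the command is the one logged above; the wrapper here is an approximation, not Beam's exact code):

    import subprocess

    # The command from the log: download source distributions
    # (--no-binary :all:) for every requirement into the requirements cache.
    cmd = [
        'python', '-m', 'pip', 'download',
        '--dest', '/tmp/dataflow-requirements-cache',
        '-r', 'postcommit_requirements.txt',
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ]
    try:
        subprocess.check_output(cmd, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as error:
        # Beam re-raises with pip's output and the failing requirement spec
        # (args[0][6] in the traceback resolves to '-r' here), producing the
        # "Pip install failed for package: -r" message above.
        raise RuntimeError(
            'Pip install failed for package: %s\n'
            'Output from execution of subprocess: %s' % (cmd[6], error.output))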

-------------------- >> begin captured logging << --------------------
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/datasets HTTP/1.1" 200 None
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://dataflow-samples/game/gaming_data'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://dataflow-samples/game/gaming_data*' -> 'gs\\:\\/\\/dataflow\\-samples\\/game\\/gaming\\_data[^/\\\\]*'
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 2 files in 0.0710110664368 seconds.
root: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://dataflow-samples/game/gaming_data'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://dataflow-samples/game/gaming_data*' -> 'gs\\:\\/\\/dataflow\\-samples\\/game\\/gaming\\_data[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 2 files in 0.054297208786 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/pipeline.pb...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
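
[Editor's note] An aside on the translate_pattern lines in the captured logging above: Beam compiles the GCS glob into a regular expression before listing files. A rough stand-in for that translation (an illustration under stated assumptions, not Beam's actual implementation):

    import re

    def glob_to_regex(pattern):
        # Escape the glob, then turn the escaped '*' back into "any run of
        # characters that is not a path separator", matching the log output.
        return re.escape(pattern).replace('\\*', '[^/\\\\]*')

    # On Python 2, re.escape also escapes ':', '/', '-' and '_', which is why
    # the logged regex reads gs\:\/\/dataflow\-samples\/game\/gaming\_data[^/\\]*
    # (Python 3.7+ escapes fewer characters, so its output differs).
    print(glob_to_regex('gs://dataflow-samples/game/gaming_data*'))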

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3986.574s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 21s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/6fmiympaqkszi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #594

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/594/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #593

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/593/display/redirect?page=changes>

Changes:

[lostluck] Helper to get the value of a KV type


------------------------------------------
[...truncated 691.73 KB...]
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)
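
Note the literal '%s' left in the exception text above: apache_beam/io/gcp/bigquery_file_loads.py line 511 passes the format string and self._latest_error to Exception as two separate arguments, so Python stores them as the exception's args tuple instead of interpolating the error into the message. A minimal, self-contained sketch (hypothetical error string, not Beam's actual code) showing both behaviours:

    err = "Not found: Table my_dataset.tmp"

    # Two arguments: Exception keeps them as an args tuple, and
    # str(exc) renders the raw format string next to the error value.
    try:
        raise Exception('BigQuery jobs failed. BQ error: %s', err)
    except Exception as exc:
        print(str(exc))
        # -> ('BigQuery jobs failed. BQ error: %s', 'Not found: Table my_dataset.tmp')

    # Interpolating before raising yields the intended message.
    try:
        raise Exception('BigQuery jobs failed. BQ error: %s' % err)
    except Exception as exc:
        print(str(exc))
        # -> BigQuery jobs failed. BQ error: Not found: Table my_dataset.tmp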

root: INFO: 2019-09-30T20:46:50.661Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-09-30T20:46:50.728Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
root: INFO: 2019-09-30T20:46:50.763Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
root: INFO: 2019-09-30T20:46:50.802Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-30T20:46:50.840Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-30T20:46:50.863Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-30T20:46:50.873Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-30T20:46:50.913Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-30T20:46:50.918Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-30T20:46:50.951Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
root: INFO: 2019-09-30T20:46:50.978Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
root: INFO: 2019-09-30T20:46:51.008Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten.out" materialized.
root: INFO: 2019-09-30T20:46:51.046Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-30T20:46:52.229Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-09-30T20:46:52.942Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-09-30T20:46:52.969Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-09-30T20:46:53.040Z: JOB_MESSAGE_DEBUG: Executing failure step failure98
root: INFO: 2019-09-30T20:46:53.074Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S53:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.,
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.,
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.,
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.
root: INFO: 2019-09-30T20:46:53.599Z: JOB_MESSAGE_WARNING: S29:WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) failed.
root: INFO: 2019-09-30T20:46:53.636Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-30T20:46:53.749Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T20:46:54.379Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T20:46:54.407Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T20:49:20.976Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T20:49:21.014Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T20:49:21.048Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-30_13_41_11-18079351449406464958 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
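
The root cause in the JobStatus entries above is reason: u'notFound': by the time the WaitForCopyJobs step ran, the temporary load table it referenced no longer existed in location US. One way to confirm whether such a table is still present is the google-cloud-bigquery client; a sketch, assuming a reasonably recent google-cloud-bigquery release, application-default credentials, and a hypothetical table ID:

    from google.api_core.exceptions import NotFound
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    # Hypothetical ID; the real temp-table names appear in the ErrorProto above.
    table_id = "apache-beam-testing.my_dataset.my_temp_table"

    try:
        table = client.get_table(table_id)
        print("table exists, created %s" % table.created)
    except NotFound:
        # Matches reason u'notFound': the copy job referenced a table
        # that had already been removed.
        print("table not found")

Note that the failing fused stage S53 chains WaitForCopyJobs together with the RemoveTempTables steps, which is at least consistent with a temp table being dropped before its copy job had been verified.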

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3560.654s

FAILED (SKIP=4, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_18-11262061949655352600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_34_26-4144143286882013219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_41_13-15773746256158000405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_48_36-3812148319424550034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_55_43-12644389554144417012?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_03_10-12400468502706147769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_10_26-479697534740292318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_17_54-15209654924793024956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_19-12127745356714157966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_44_57-4636402299800957687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_02_40-4243833746896103470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_24-2545215373072563376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_41_11-18079351449406464958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_49_45-3557387648186070139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_57_40-8269640652522417617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_05_15-16055213729348637551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_23-12183176833803058343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_39_04-8833938350517598946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_46_19-6215173368569923451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_52_57-2676182881689520548?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_00_00-2583388801769005969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_18-4162797980096061055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_43_50-2031423382545935332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_51_12-11165925138310668381?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_58_39-6114012822787355110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_17-6504666919768180457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_34_16-13171232894320765256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_42_15-3033505486995659633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_48_33-11406389230598089828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_55_41-2950184232358624408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_19-6438056568745940327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_34_08-395964721375571668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_43_38-2805282141519175509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_50_58-1726937768829822425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_58_26-279067626545712361?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_18-16154200092724124292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_35_50-5446163863862887261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_45_17-7939372622562979530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_52_16-3022913811158294446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_00_07-13934941983528028110?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 22s
109 actionable tasks: 85 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/vpsbqr4z5kt52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #592

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/592/display/redirect?page=changes>

Changes:

[lukecwik] [BEAM-6923] limit gcs buffer size to 1MB for artifact upload (#9647)


------------------------------------------
[...truncated 556.55 KB...]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) [DEPLOYING]
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) [DEPLOYING].
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/2) (0ee1d302d7fb66b7fccc7ac50cefee67) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a) [DEPLOYING].
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) [DEPLOYING].
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2) (03207cb06439c9967a79819e425fb372) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2) (03207cb06439c9967a79819e425fb372).
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2) (f69d088abf6addc01e24fd1c98348479) switched from RUNNING to FINISHED.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2) (f69d088abf6addc01e24fd1c98348479).
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2) (e504f9241ff6ddb0270f0c033dc322d2) switched from RUNNING to FINISHED.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2) (e504f9241ff6ddb0270f0c033dc322d2).
[CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2) (03207cb06439c9967a79819e425fb372) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) 03207cb06439c9967a79819e425fb372.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:377>), Map(<lambda at external_test.py:378>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (2/2) (03207cb06439c9967a79819e425fb372) switched from RUNNING to FINISHED.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2) (e504f9241ff6ddb0270f0c033dc322d2) [FINISHED]
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2) (f69d088abf6addc01e24fd1c98348479) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) e504f9241ff6ddb0270f0c033dc322d2.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) f69d088abf6addc01e24fd1c98348479.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2) (e504f9241ff6ddb0270f0c033dc322d2) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (attempt #0) to 3b066015-a9c0-46a6-bacd-f13813a06adc @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2) (f69d088abf6addc01e24fd1c98348479) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) d59f69494539f8ff5446032112fd082f.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) (d59f69494539f8ff5446032112fd082f) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a).
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from CREATED to SCHEDULED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) 77c9d83a9325cd0505a6c960ff85379a.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (attempt #0) to 3b066015-a9c0-46a6-bacd-f13813a06adc @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) (77c9d83a9325cd0505a6c960ff85379a) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 2eb0aea6925c9f2170ea1885e448aa90.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (2/2) (attempt #0) to 3b066015-a9c0-46a6-bacd-f13813a06adc @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (2/2).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (2eb0aea6925c9f2170ea1885e448aa90) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) [DEPLOYING]
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) [DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) [DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef).
[DataSink (DiscardingOutput) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) cf4030e72d0788822fd1d89dd7b64cef.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (2/2) (cf4030e72d0788822fd1d89dd7b64cef) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4).
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 8f9dabf8552a09055e3d407d62ab3ab4.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (1/2) (attempt #0) to 3b066015-a9c0-46a6-bacd-f13813a06adc @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (1/2).
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from CREATED to DEPLOYING.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (8f9dabf8552a09055e3d407d62ab3ab4) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) [DEPLOYING]
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47).
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 599c02b9dc4ad11f4ba220a8854fbd47.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (599c02b9dc4ad11f4ba220a8854fbd47) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-0930190446-5c0b97f9 (80406c00433edd097d4ad439f8b75eeb) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 80406c00433edd097d4ad439f8b75eeb reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-0930190446-5c0b97f9(80406c00433edd097d4ad439f8b75eeb).
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647}, allocationId: 6f4f0874d753b3323dea5e9dde0d58a3, jobId: 80406c00433edd097d4ad439f8b75eeb).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection d59832bc946ceaf2aa2cbac2e4d2923f: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 88f6b3cbed8fbf8de21ff99ef51849e5@akka://flink/user/jobmanager_1 for job 80406c00433edd097d4ad439f8b75eeb from the resource manager.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 80406c00433edd097d4ad439f8b75eeb with leader id 88f6b3cbed8fbf8de21ff99ef51849e5 lost leadership.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647}, allocationId: 488ef06a5143034b301bab9f4a28f7bc, jobId: 80406c00433edd097d4ad439f8b75eeb).
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 80406c00433edd097d4ad439f8b75eeb from job leader monitoring.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 80406c00433edd097d4ad439f8b75eeb.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 80406c00433edd097d4ad439f8b75eeb.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 80406c00433edd097d4ad439f8b75eeb because it is not registered.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /tmp/flink-io-d9411752-9095-455c-9e3e-9ae07e892256
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-2cf21c75-8405-49c6-ba1f-dbc6a1b15cf8
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:45839
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 19309 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 7, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 
ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 2, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 11, 
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2474>)_26}: 0, 
ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 
ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 2, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 2, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 11, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 3, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 2, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 2, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}: 5, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}: 0)Distributions(ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, max=21}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Manifest at /tmp/beam-artifact-staging/job_0de2d8dd-0a60-481a-8f03-48725b7a908f/MANIFEST has 0 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-artifact-staging/job_0de2d8dd-0a60-481a-8f03-48725b7a908f/
INFO:root:Job state changed to DONE
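
The "Final accumulator values" dump above is just the string form of MetricQueryResults; the same data is queryable from the PipelineResult after the run. A minimal sketch, assuming an illustrative user counter (the namespace and counter names below are placeholders, not from this build):

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.metrics.metric import MetricsFilter

    elements = Metrics.counter('my.namespace', 'elements')  # placeholder names

    p = beam.Pipeline()
    _ = (p
         | beam.Create([1, 2, 3])
         | beam.Map(lambda x: (elements.inc(), x)[-1]))
    result = p.run()
    result.wait_until_finish()

    # query() returns dicts keyed 'counters', 'distributions' and 'gauges',
    # which is what the runner rendered as text above.
    for counter in result.metrics().query(
            MetricsFilter().with_name('elements'))['counters']:
        print(counter.key, counter.committed)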

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
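
The BigQuerySink deprecation above points at WriteToBigQuery as the replacement. A minimal migration sketch; the table spec and schema are placeholders, not taken from this build:

    import apache_beam as beam

    # Sketch: write with WriteToBigQuery instead of the deprecated
    # BigQuerySink. 'my-project:my_dataset.my_table' is a placeholder.
    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',
                 schema='fruit:STRING',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))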
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
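
The repeated "options is deprecated" warnings come from reading <pipeline>.options; the non-deprecated pattern is to keep a handle on the PipelineOptions the pipeline was built with. A small sketch; the temp_location value is a placeholder:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # Sketch: read values from the options object directly rather than
    # via the deprecated <pipeline>.options attribute.
    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b'])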
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
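
MatchAll and ReadMatches, flagged as experimental above, are fileio's match-then-read primitives. A short usage sketch; the glob is illustrative:

    import apache_beam as beam
    from apache_beam.io import fileio

    # Sketch: expand a glob with MatchAll, open the results with
    # ReadMatches, then read each matched file as UTF-8 text.
    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/inputs/*.txt'])  # illustrative glob
             | fileio.MatchAll()
             | fileio.ReadMatches()
             | beam.Map(lambda readable_file: readable_file.read_utf8()))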
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3806.172s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 28s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/dxhy3kbxr56gu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #591

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/591/display/redirect?page=changes>

Changes:

[ihr] Add clarification about authorized views


------------------------------------------
[...truncated 1.15 MB...]
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "add_attribute.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5", 
        "user_name": "add_attribute"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s4", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "to_proto_str"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s3"
        }, 
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7", 
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s5", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s4"
        }, 
        "pubsub_id_label": "id", 
        "pubsub_serialized_attributes_fn": "", 
        "pubsub_timestamp_label": "timestamp", 
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output3709e04e-e239-4492-abe9-104e903f9423", 
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: u'2019-09-30T18:21:49.835989Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-09-30_11_21_48-16531321609146972450'
 location: u'us-central1'
 name: u'beamapp-jenkins-0930182139-324092'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-09-30T18:21:49.835989Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-09-30_11_21_48-16531321609146972450]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_21_48-16531321609146972450?project=apache-beam-testing
root: INFO: Job 2019-09-30_11_21_48-16531321609146972450 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T18:21:54.364Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T18:21:55.107Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-09-30T18:21:57.931Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-09-30T18:21:57.935Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-09-30T18:21:57.947Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T18:21:57.961Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-09-30T18:21:57.965Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-09-30T18:21:57.969Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T18:21:57.993Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T18:21:57.997Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromPubSub/Map(_from_proto_str) into ReadFromPubSub/Read
root: INFO: 2019-09-30T18:21:58.003Z: JOB_MESSAGE_DETAILED: Fusing consumer add_attribute into ReadFromPubSub/Map(_from_proto_str)
root: INFO: 2019-09-30T18:21:58.006Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into add_attribute
root: INFO: 2019-09-30T18:21:58.009Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
root: INFO: 2019-09-30T18:21:58.021Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input3709e04e-e239-4492-abe9-104e903f9423 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
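
The message above describes watermarks computed from a custom timestamp attribute; on the SDK side that corresponds to ReadFromPubSub's timestamp_attribute parameter. A sketch with a placeholder subscription path:

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub
    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch: read Pub/Sub messages whose event time comes from the
    # 'timestamp' attribute; the subscription path is a placeholder.
    options = PipelineOptions(['--streaming'])
    with beam.Pipeline(options=options) as p:
        _ = (p
             | ReadFromPubSub(
                 subscription='projects/my-project/subscriptions/my-sub',
                 with_attributes=True,
                 timestamp_attribute='timestamp')
             | beam.Map(lambda msg: msg.data))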
root: INFO: 2019-09-30T18:21:58.025Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T18:21:58.082Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T18:21:58.100Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T18:21:58.251Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-09-30T18:21:58.285Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T18:21:58.293Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-09-30T18:22:01.190Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input3709e04e-e239-4492-abe9-104e903f9423'.
root: INFO: 2019-09-30T18:22:01.768Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+ReadFromPubSub/Map(_from_proto_str)+add_attribute+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-09-30T18:22:33.328Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
root: INFO: 2019-09-30T18:22:33.665Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T18:22:34.899Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-09-30T18:22:48.780Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: WARNING: Timing out on waiting for job 2019-09-30_11_21_48-16531321609146972450 after 182 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3664.981s

FAILED (SKIP=4, errors=2)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_38-16901419110014889957?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_57_49-7475105659030011096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_05_32-4496914957301354213?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_13_24-2771102062295911994?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_20_56-13339898215555619437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_27_59-13051205535088924831?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_35_38-10724745329956855114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_44_22-11241545398091244088?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_44-16930092108876277932?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_05_14-8056241579649777046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_13_14-7312506171381120591?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_20_41-2344562959471592228?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_38-15037812494594669537?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_10_59-7564218317427133438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_17_37-1512064614793770505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_24_42-1794042521161386848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_42-14327640381037732945?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_03_22-5471397682867019160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_10_27-299608755672028772?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_17_20-15002310708739009749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_24_21-5123581147704754927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_38-14983526234444667929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_07_49-2165726834858822675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_15_36-9931262432989040304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_23_20-15331977806657291349?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_38-10598481877107188419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_57_36-14108819349431721360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_06_37-3828655370825768242?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_15_00-17047650672349280918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_21_48-16531321609146972450?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_38-13003976029292357832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_58_42-4658144704976022142?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_06_32-10576254895279869210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_14_01-5351732948225115710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_21_04-15719734791598802587?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_28_07-2266488782136667446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_50_38-14293562955019993334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_10_59_31-2687668927035226649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_09_08-3589369332705434588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_11_26_17-3061226627815900507?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 5s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/xh5btytf2pzf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #590

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/590/display/redirect?page=changes>

Changes:

[kirillkozlov] [BEAM-8275] Beam SQL should support BigQuery in DIRECT_READ mode

[github] Addressed review comments

[github] Added a test for BigQuery SQL read in EXPORT mode


------------------------------------------
[...truncated 1.50 MB...]
      "major": "7"
    }, 
    "workerPools": [
      {
        "autoscalingSettings": {}, 
        "kind": "harness", 
        "numWorkers": 1, 
        "packages": [
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/requirements.txt", 
            "name": "requirements.txt"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.1.0.zip", 
            "name": "setuptools-41.1.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/PyHamcrest-1.9.0.tar.gz", 
            "name": "PyHamcrest-1.9.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/mock-3.0.5.tar.gz", 
            "name": "mock-3.0.5.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.1.0.post1.tar.gz", 
            "name": "setuptools-41.1.0.post1.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.0.1.zip", 
            "name": "setuptools-41.0.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/six-1.12.0.tar.gz", 
            "name": "six-1.12.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/funcsigs-1.0.2.tar.gz", 
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.2.0.zip", 
            "name": "setuptools-41.2.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190802"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0930165314-347896", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15698623933884", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
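For reference, a job graph with this shape -- a ParallelRead of a legacy-SQL BigQuery query fused into a ParallelWrite/NativeWrite -- is what the query-to-table integration test submits. A minimal sketch of such a pipeline follows; the project, bucket, and dataset identifiers are placeholders rather than the values the test generates, while the query, schema, and dispositions are copied from the step properties above.

    # Sketch of a query-to-table pipeline matching steps s1/s2 above.
    # Project, bucket, and dataset names are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                  # placeholder
        temp_location='gs://my-bucket/temp',   # placeholder
        region='us-central1')

    with beam.Pipeline(options=options) as p:
        (p
         # s1: ParallelRead -- legacy SQL (bigquery_use_legacy_sql: true).
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(
             query='SELECT * FROM (SELECT "apple" as fruit), '
                   '(SELECT "orange" as fruit),',
             use_standard_sql=False))
         # s2: ParallelWrite -- CREATE_IF_NEEDED / WRITE_EMPTY, as in the graph.
         | 'write' >> beam.io.WriteToBigQuery(
             table='output_table',
             dataset='python_query_to_table',  # placeholder; the test appends a suffix
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))

On the Dataflow runner this read surfaces later in the log as a BigQuery query job followed by an Avro export job, and the write as a BigQuery import job.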
root: INFO: Create job: <Job
 createTime: u'2019-09-30T16:53:24.826061Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-09-30_09_53_23-11518054275188141166'
 location: u'us-central1'
 name: u'beamapp-jenkins-0930165314-347896'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-09-30T16:53:24.826061Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-30_09_53_23-11518054275188141166]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_09_53_23-11518054275188141166?project=apache-beam-testing
root: INFO: Job 2019-09-30_09_53_23-11518054275188141166 is in state JOB_STATE_RUNNING
root: INFO: 2019-09-30T16:53:23.661Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-09-30_09_53_23-11518054275188141166.
root: INFO: 2019-09-30T16:53:23.661Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-09-30_09_53_23-11518054275188141166. The number of workers will be between 1 and 1000.
root: INFO: 2019-09-30T16:53:26.790Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T16:53:27.540Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-09-30T16:53:28.237Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T16:53:28.292Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T16:53:28.322Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T16:53:28.358Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-30T16:53:28.507Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T16:53:28.539Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-09-30T16:53:28.566Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-30T16:53:28.601Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-30T16:53:28.637Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-30T16:53:28.671Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T16:53:28.794Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-09-30T16:53:28.868Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-09-30T16:53:28.907Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-30T16:53:28.945Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-09-30T16:53:31.042Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_18190909499168834354". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_18190909499168834354".
root: INFO: 2019-09-30T16:53:55.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T16:54:32.963Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T16:54:32.994Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-30T16:55:13.747Z: JOB_MESSAGE_BASIC: BigQuery query completed, job : "dataflow_job_18190909499168834354"
root: INFO: 2019-09-30T16:55:14.143Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_15720953621548940030" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_15720953621548940030".
root: INFO: 2019-09-30T16:55:44.620Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_15720953621548940030" observed total of 1 exported files thus far.
root: INFO: 2019-09-30T16:55:44.662Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_15720953621548940030"
root: INFO: 2019-09-30T16:58:05.913Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_18190909499168835932". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_18190909499168835932".
root: INFO: 2019-09-30T16:59:28.778Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-30T17:05:28.778Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: WARNING: Timing out on waiting for job 2019-09-30_09_53_23-11518054275188141166 after 904 seconds
--------------------- >> end captured logging << ---------------------
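The closing WARNING above comes from the test harness rather than from the service: the harness bounds how long it waits for the Dataflow job using PipelineResult.wait_until_finish, whose duration argument is in milliseconds. A sketch of that pattern, with placeholder options and a trivial stand-in pipeline body:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    # Sketch of a harness-style bounded wait; only the timeout pattern matters.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                  # placeholder
        temp_location='gs://my-bucket/temp',   # placeholder
        region='us-central1')
    p = beam.Pipeline(options=options)
    p | beam.Create(['apple', 'orange'])       # trivial stand-in pipeline

    result = p.run()
    # duration is in milliseconds; 900 s mirrors the ~904 s timeout logged above.
    state = result.wait_until_finish(duration=900 * 1000)
    if state != PipelineState.DONE:
        raise AssertionError('pipeline did not finish, last state: %s' % state)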

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4181.793s

FAILED (SKIP=4, errors=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 45s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/m3bqcglj4gbeg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
