Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/25 07:35:29 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #1071

See <https://builds.apache.org/job/beam_PostCommit_Python2/1071/display/redirect>

Changes:


------------------------------------------
[...truncated 953.50 KB...]
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 95be55cfe4b67b7c4abed2f4b503c137 with leader id 8091d196e26110b15cf4ddfba5d245cb lost leadership.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8138}, allocationId: c24f9062a0d5d5fedbcb1fdc0fbccc94, jobId: 95be55cfe4b67b7c4abed2f4b503c137).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 95be55cfe4b67b7c4abed2f4b503c137 from job leader monitoring.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 95be55cfe4b67b7c4abed2f4b503c137.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 95be55cfe4b67b7c4abed2f4b503c137.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 95be55cfe4b67b7c4abed2f4b503c137 because it is not registered.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 0a792b072bd8b29286ed1091b38f7a66.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection 5c620a31-5e79-4500-9a23-cfbc07040111 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-63b65fce-f306-4e2b-84c5-e7e3e58063fb
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-f7103581-6767-49f5-a1f4-ca60d245ff7d
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-13c70587-f0da-45ae-b162-fb73df4d2991
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:38383
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15466 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 8, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 2, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 7, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 2, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 7, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 2, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 8, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 2, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, 
max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-artifact-staging/job_de4624b1-ae2c-4ed7-912a-fdcc4985f8c8/MANIFEST has 0 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-artifact-staging/job_de4624b1-ae2c-4ed7-912a-fdcc4985f8c8/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
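For reference, the __metricscontainers dump above is the flattened string form of the pipeline's MetricResults; the same counters and distributions can be read programmatically from the PipelineResult. A minimal sketch against the Beam Python SDK (the 'example'/'my_counter' names are hypothetical, not taken from this job):

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.metrics.metric import MetricsFilter

    class CountingFn(beam.DoFn):
        def __init__(self):
            # hypothetical user counter; namespace/name are free-form
            self.counter = Metrics.counter('example', 'my_counter')

        def process(self, element):
            self.counter.inc()
            yield element

    p = beam.Pipeline()
    _ = p | beam.Create([1, 2, 3]) | beam.ParDo(CountingFn())
    result = p.run()
    result.wait_until_finish()

    # query committed values by metric name, as reported by the runner
    for counter in result.metrics().query(
            MetricsFilter().with_name('my_counter'))['counters']:
        print(counter.key, counter.committed)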

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
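The BeamDeprecationWarning above comes from the native BigQuerySink code path; a minimal sketch of the replacement the message names, WriteToBigQuery (the table spec and schema here are hypothetical placeholders, and running it requires GCP credentials):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'abc', 'count': 1}])
        _ = rows | beam.io.WriteToBigQuery(
            'project:dataset.table',  # hypothetical table spec
            schema='word:STRING,count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)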
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
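These repeated warnings flag reads of <pipeline>.options; the supported pattern is to keep the PipelineOptions object the pipeline was constructed with and call view_as on it directly. A minimal sketch (the flag values are hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, StandardOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])

    # instead of p.options.view_as(...), query the options object you built
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    streaming = options.view_as(StandardOptions).streaming

    p = beam.Pipeline(options=options)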
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
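A note on the failure mode: the TimedOutException above is raised by nose's multiprocess plugin from a SIGALRM handler once a test exceeds its budget, so the traceback ends wherever the process happened to be blocked (here, deep in ssl.py reading an HTTP response). A minimal sketch of that mechanism; the 5-second budget and the slow_test stand-in are illustrative, not the plugin's actual values:

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # The handler fires inside whatever frame is currently running,
        # which is why the traceback above ends in ssl.py rather than in
        # the test body itself.
        raise TimedOutException()

    def slow_test():
        time.sleep(10)  # stands in for a hung Dataflow API call

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(5)  # illustrative per-test budget, in seconds
    try:
        slow_test()
    except TimedOutException:
        print('test timed out')
    finally:
        signal.alarm(0)  # always cancel any pending alarm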

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5486.808s

FAILED (SKIP=4, errors=1)
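The wrapper frame at retry.py line 209 in the traceback above is Beam's retry decorator around the Dataflow API call. A sketch of how such a call site is wrapped, using apache_beam.utils.retry.with_exponential_backoff; get_job_with_retries and the retry budget are illustrative:

    from apache_beam.utils import retry

    @retry.with_exponential_backoff(num_retries=3)  # illustrative budget
    def get_job_with_retries(client, request):
        # Stand-in for apiclient.get_job above: transient failures are
        # retried with exponential backoff, and the last attempt's
        # exception escapes through the wrapper frame seen at retry.py:209.
        return client.projects_locations_jobs.Get(request)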

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 39s
115 actionable tasks: 89 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/gf3yzc5kpqjvg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #1122

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1122/display/redirect?page=changes>



beam_PostCommit_Python2 - Build # 1121 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python2 (build #1121)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python2/1121/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python2 #1120

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1120/display/redirect?page=changes>

Changes:

[lostluck] [BEAM-2929] Fix go code format for


------------------------------------------
[...truncated 2.00 MB...]
	details = "Socket closed"
	debug_error_string = "{"created":"@1575337109.356450950","description":"Error received from peer ipv4:127.0.0.1:33755","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575337109.356933237","description":"Error received from peer ipv4:127.0.0.1:44781","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
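The _Rendezvous errors above are how gRPC's Python API reports a dropped stream: StatusCode.UNAVAILABLE with "Socket closed" here means the job server closed the control connection while the SDK worker was still reading it. A sketch of tolerating that during teardown; consume_control_stream and handle are illustrative names:

    import grpc

    def consume_control_stream(control_stub, get_responses, handle):
        # Iterating a bidirectional stream such as control_stub.Control()
        # raises grpc.RpcError when the peer closes the socket.
        try:
            for work_request in control_stub.Control(get_responses()):
                handle(work_request)
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                return  # expected when the runner tears the connection down
            raise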


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
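The deprecation warning above points from BigQuerySink to WriteToBigQuery. A minimal sketch of the replacement transform; the table, schema, and input rows are illustrative:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a'}, {'name': 'b'}])
            | beam.io.WriteToBigQuery(
                table='my-project:my_dataset.my_table',  # illustrative
                schema='name:STRING',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))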
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
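The recurring "options is deprecated" warnings flag reads of <pipeline>.options from inside transforms; the supported pattern is to keep a reference to the PipelineOptions you constructed and read views from that. A sketch; the flag value is illustrative:

    from apache_beam.options.pipeline_options import (
        DebugOptions, GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])
    # Read views from the options object you built, not from pipeline.options:
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    experiments = options.view_as(DebugOptions).experiments or []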
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
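MatchAll and ReadMatches, flagged experimental in the FutureWarnings above, expand file patterns into readable file handles. A minimal sketch of the pattern behind those fileio_test excerpts; the glob and the SHA-1 checksum are illustrative:

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # ReadMatches yields ReadableFile objects; read() returns bytes.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create(['/tmp/data/*.txt'])  # illustrative glob
            | fileio.MatchAll()                 # experimental, per warning
            | fileio.ReadMatches()
            | 'Checksums' >> beam.Map(compute_hash))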
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_43-2896518157479954031?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_35_41-18353514639182345752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_44_14-15907968221985319336?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1470, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1410, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1366, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 620, in __ProcessHttpResponse
    return self.__client.DeserializeMessage(response_type, content)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 446, in DeserializeMessage
    message = encoding.JsonToMessage(response_type, data)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 287, in __decode_dictionary
    for item in value]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 213, in decode_message
    dictionary = json.loads(encoded_message)
  File "/usr/lib/python2.7/json/__init__.py", line 336, in loads
    if (cls is None and encoding is None and object_hook is None and
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
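The __exit__ frame at pipeline.py line 436 in the traceback above explains where the test hangs: leaving a "with beam.Pipeline() as p" block implicitly calls run().wait_until_finish(), so a slow or cancelled Dataflow job keeps the test inside pipeline teardown until the nose alarm fires. A self-contained sketch of the same protocol on the local runner:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # Leaving the with-block triggers run().wait_until_finish(); on a slow
    # runner this is exactly where a per-test timeout would interrupt.
    with beam.Pipeline() as p:
        doubled = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: 2 * x)
        assert_that(doubled, equal_to([2, 4, 6]))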

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5498.400s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_47-8168167603182467418?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_42_26-6002007972115981440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_49_46-12959813165303241844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_56_56-8707878326218607490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_04_22-7077886059396695739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_11_41-12310939640661425973?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_19_09-15321893913967035788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_43-14334900171218608179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_48_11-14486413400357063989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_55_47-12124152288134325597?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_02_41-18002336912883793815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_45-17898357538188094880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_40_20-14391643642390154692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_47_28-6415928652066814078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_54_39-17203340848955198506?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_01_31-15056177102147427018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_42-15273352003667828999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_47_09-15343569968332835189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_04_20-12872153953301270810?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_41-11552983242905546013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_34_52-4858239842476737114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_43_33-3625719987800561903?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_51_14-936690857299616804?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_58_05-6433287054717848384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_04_58-18415968408868608097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_43-10591138689208292269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_35_40-14214722263832413421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_43_48-506271098145751473?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_49_48-14525067904351491697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_57_00-14720515194287110530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_27_43-3379382982531865921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_36_38-11416475603458042589?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_46_07-14084739272868763057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_17_53_47-12980766361881856252?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_00_58-15810271015241916451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_18_07_22-16019937223580390580?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 49s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/adrehjqphmhli

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python2 #1119

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1119/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-4132] Set multi-output PCollections types to Any

[migryz] Bump Release Build Timeout

[migryz] fix syntax

[github] Bump time to 5 hours.

[robertwb] [BEAM-8645] A test case for TimestampCombiner. (#10081)

[lcwik] [BEAM-2929] Ensure that the Beam Java SDK sends the property

[lcwik] [BEAM-2929] Ensure that the Beam Go SDK sends the property

[lcwik] [BEAM-2929] Ensure that the Beam Python SDK sends the property


------------------------------------------
[...truncated 2.02 MB...]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f7b8c6e0d70> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f7b8c6e0de8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f7b8c6e0e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f7b8c6e0f50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f7b8c6af050> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f7b8c6af0c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f7b8c6af140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f7b8c6af2a8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f7b8c6af320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f7b8c6af398> ====================
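Each "==================== <function ...> ====================" banner above marks one graph-rewriting phase the portable FnApiRunner applies in order (lift_combiners, expand_gbk, greedily_fuse, sort_stages, and so on). A schematic of that driver loop; apply_phases and the toy phases are illustrative, not the runner's actual code:

    def apply_phases(stages, phases):
        # Every phase maps a list of pipeline stages to a rewritten list;
        # the runner logs one banner per phase, as seen above.
        for phase in phases:
            print('==================== %r ====================' % phase)
            stages = list(phase(stages))
        return stages

    # Illustrative usage with trivial identity phases:
    identity = lambda stages: stages
    print(apply_phases(['read', 'map', 'write'], [identity, identity]))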
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f7b7d19f6d0> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/1)))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/0))))))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.496 seconds
mongoioit27306
mongoioit27306

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:731: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_22-380223751460891539?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_47_19-14505066945653927091?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_55_46-13272375979129983244?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1410, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1366, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
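The frames at test_dataflow_runner.py (wait_until_in_state) and dataflow_runner.py (state, _update_job) above show a polling loop: each read of result.state re-fetches the job over HTTP, and that request is what the alarm interrupts. A sketch of such a poller; the names, timeout, and terminal-state set are illustrative:

    import time

    TERMINAL_STATES = {'JOB_STATE_DONE', 'JOB_STATE_FAILED',
                       'JOB_STATE_CANCELLED'}

    def wait_until_in_state(result, expected_state,
                            timeout_secs=600, poll_secs=5):
        # Each access of result.state re-fetches the job from the Dataflow
        # API (the _update_job -> get_job chain in the traceback above).
        deadline = time.time() + timeout_secs
        while time.time() < deadline:
            state = result.state
            if state == expected_state or state in TERMINAL_STATES:
                return state
            time.sleep(poll_secs)
        raise RuntimeError('Timed out waiting for state %s' % expected_state)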

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5494.661s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_27-12939095301948310067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_53_45-12672503602834583718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_02_20-17410194822787099039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_09_19-6725242131571332983?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_16_59-15891367527597571690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_24_23-12080556210023750298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_32_02-17843349604070292478?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_22-9913770932879764723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_00_16-1660091942379218532?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_07_34-17522948776989997463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_15_21-13322996603990327467?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_24-14814748581502174497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_52_04-3928379783195829255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_59_05-4188329308244292850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_06_05-6171617822230798902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_13_21-7532607642441383940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_22-7597836062510690952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_57_26-14807475469272984784?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_07_16-9690408078079161691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_14_20-7321464031649040174?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_21_51-12253876731451975238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_22-900370651316494886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_46_14-6034373802058186996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_55_16-51188561705408414?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_03_49-633023303266562485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_11_33-13957565274073531256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_24-2847043142846327820?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_47_35-2763320151096541621?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_55_26-14552840905087575645?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_02_12-12091511080308154671?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_08_48-2985500717497409837?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_16_44-10651971764363941659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_39_23-1346093185166450443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_48_08-5489429428582719929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_15_57_58-12730536559670474575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_16_15_48-6473902978783651277?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 17s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/5gjbr2ywj3yby

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python2 #1118

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1118/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8733]  Handle the registration request synchronously in the Python


------------------------------------------
[...truncated 2.08 MB...]
          "output_name": "out", 
          "step_name": "s18"
        }, 
        "serialized_fn": "<string of 1340 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-12-02T21:13:11.454327Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-12-02_13_13_09-6568729113300381307'
 location: u'us-central1'
 name: u'beamapp-jenkins-1202211257-778953'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-12-02T21:13:11.454327Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-02_13_13_09-6568729113300381307]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_13_09-6568729113300381307?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-02_13_13_09-6568729113300381307 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:09.423Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-02_13_13_09-6568729113300381307.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:09.423Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-02_13_13_09-6568729113300381307. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:13.284Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:14.302Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.003Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.021Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.091Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.124Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.219Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.338Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.365Z: JOB_MESSAGE_DETAILED: Fusing consumer row to string into read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.398Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/KeyWithVoid into row to string
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.433Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into count/CombineGlobally(CountCombineFn)/KeyWithVoid
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.461Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.492Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.512Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.534Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.553Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/UnKey into count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.588Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s13.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.614Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.643Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.669Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.695Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.720Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.743Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15-u31 for input s16-reify-value9-c29
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.773Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.791Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.824Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.852Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.882Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into count/CombineGlobally(CountCombineFn)/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.908Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.937Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.974Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.012Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.037Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.070Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.102Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.422Z: JOB_MESSAGE_DEBUG: Executing wait step start41
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.503Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.540Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.553Z: JOB_MESSAGE_BASIC: Executing operation count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.574Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.615Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.628Z: JOB_MESSAGE_BASIC: Finished operation count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.677Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.713Z: JOB_MESSAGE_DEBUG: Value "count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.758Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.800Z: JOB_MESSAGE_BASIC: Executing operation read+row to string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:17.400Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_15449188239845921278" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_15449188239845921278".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:47.744Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_15449188239845921278" observed total of 1 exported files thus far.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:47.777Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_15449188239845921278"
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:51.055Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.823Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 1250.0 in region us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.861Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.914Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.914Z: JOB_MESSAGE_BASIC: Finished operation read+row to string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:06.030Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:06.146Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:06.162Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:25.235Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:25.259Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-02_13_13_09-6568729113300381307 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
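
Note: this job failed before any test logic executed: Dataflow could not
start even the single requested worker because the project's regional CPU
quota (limit 1250.0 in us-central1) was already exhausted, most likely by
the many integration-test jobs running concurrently. Where a per-job cap
helps, autoscaling can be bounded through pipeline options; a hedged sketch
(flag values are illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    # --max_num_workers bounds autoscaling, limiting how many CPUs a
    # single job can claim from the shared regional quota.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--max_num_workers=10',  # illustrative cap
    ])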

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 620, in __ProcessHttpResponse
    return self.__client.DeserializeMessage(response_type, content)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 446, in DeserializeMessage
    message = encoding.JsonToMessage(response_type, data)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 287, in __decode_dictionary
    for item in value]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 310, in decode_message
    result = _ProcessUnknownEnums(result, encoded_message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 531, in _ProcessUnknownEnums
    decoded_message = json.loads(six.ensure_str(encoded_message))
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 380, in raw_decode
    obj, end = self.scan_once(s, idx)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
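
Note: the TimedOutException above is not raised by the code in the
traceback. nose's multiprocess plugin arms a Unix alarm signal as a
per-test timeout, and its handler raises the exception inside whatever
frame happens to be executing when the alarm fires, which here is deep in
the JSON decoder. The underlying pattern, as a minimal standalone sketch:

    import signal

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # Raised inside whatever code is running when the alarm fires,
        # which is why the traceback above ends in json/decoder.py.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, signalhandler)  # Unix-only
    signal.alarm(5)  # seconds until the handler interrupts execution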

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5507.761s

FAILED (SKIP=5, errors=2)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_19-4730468962750505463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_12_23-119448814224082348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_14_49-11999213596418577564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_32_18-9248323002065203603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_17-4584725342455771544?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_17_33-1305790076117006026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_24_53-11059203174390592472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_32_08-1150938119988014425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_18-2095185521590466840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_09_44-6449607306231753708?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_16_51-9674344046618286839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_44-1418816685479814778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_41-4175264225905358873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_16-2094587883804841407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_05_02-15459944219501791050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_13_09-6568729113300381307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_15_48-12675993783668797328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_17-2215698956183388847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_42-1321765033105109363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_38_56-15047909077508743349?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_46_30-16011753843657935912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_16-1344611298550093374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_04_43-14662117601106516085?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_14_27-10521102793740008566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_22_13-14926056476407583134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_29_19-3993886651928485724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_36_04-4422004491971162823?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_17-11020297155458256049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_05_49-13456258886180144807?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_13_40-10808604804372799222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_21_12-17981342819038252730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_27_58-5772905708610951996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_16-14423576763023242116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_05_56-17199531945120722342?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_15_47-3545761448411806136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_09-14250741536215217952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_30_58-3820199892334211447?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 46s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/qe2fr54tpgxrm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1117

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1117/display/redirect>

Changes:


------------------------------------------
[...truncated 1.80 MB...]
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575310508.011136740","description":"Error received from peer ipv4:127.0.0.1:42509","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575310508.011252888","description":"Error received from peer ipv4:127.0.0.1:42699","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575310508.011207955","description":"Error received from peer ipv4:127.0.0.1:45297","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
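
Note: the three _Rendezvous errors above are a shutdown race rather than a
test failure: the job server closes its gRPC channels while the SDK
worker's reader threads are still blocked on them, so each pending RPC
terminates with StatusCode.UNAVAILABLE ("Socket closed"). A minimal sketch
of distinguishing that teardown case from a real error (hypothetical
helper, not Beam code):

    import grpc

    def call_ignoring_shutdown(rpc, *args, **kwargs):
        # Returns None when the RPC failed only because the channel was
        # already closed (UNAVAILABLE); re-raises anything else.
        try:
            return rpc(*args, **kwargs)
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                return None
            raise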


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR
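
Note: the BeamDeprecationWarning lines interspersed above all come from
code that reads options back off the pipeline object (p.options,
pcoll.pipeline.options), a pattern deprecated since the first stable
release. The supported pattern keeps a reference to the PipelineOptions
instance itself; a minimal sketch (the bucket path is hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://example-bucket/tmp'])
    # Read settings from the options object, not from <pipeline>.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])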

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5446.327s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 52s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/mzyrbmp3coxvq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1116

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1116/display/redirect>

Changes:


------------------------------------------
[...truncated 2.03 MB...]
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempEdA7w_/artifactsMrPagG/job_779d25bd-dae3-42e5-a054-fee99eccaaa7/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1202132939-a2a93929_77f01cd9-c0a2-40aa-92c9-1c6d801cd0c1
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1202132939-a2a93929_77f01cd9-c0a2-40aa-92c9-1c6d801cd0c1
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
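
Note: the "Getting job metrics" lines above are the job server answering
the SDK's post-run metrics query. On the SDK side this corresponds roughly
to the following; a minimal sketch on the default runner (illustrative
only):

    import apache_beam as beam
    from apache_beam.metrics.metric import MetricsFilter

    p = beam.Pipeline()  # default runner; options omitted for brevity
    _ = p | beam.Create([1, 2, 3])
    result = p.run()
    result.wait_until_finish()
    # On portable runners this query is served over the job API, which is
    # what the InMemoryJobService "job metrics" log lines record.
    metrics = result.metrics().query(MetricsFilter())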

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/12/02 13:30:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:38979
19/12/02 13:30:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:46201
19/12/02 13:30:49 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:42897
WARNING:root:Waiting for grpc channel to be ready at localhost:42897.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/02 13:30:51 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9
19/12/02 13:30:51 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/12/02 13:30:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/12/02 13:30:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/12/02 13:30:52 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/12/02 13:30:54 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/12/02 13:30:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9 on Spark master local[4]
19/12/02 13:30:56 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/12/02 13:30:56 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40633.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/12/02 13:31:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40483.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:43549
19/12/02 13:31:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 13:31:01 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/12/02 13:31:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/02 13:31:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9 finished.
19/12/02 13:31:58 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 13:31:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temptzRH9F/artifactss1oSRH/job_6161bc2f-aefd-4893-932e-88ba8830c790/MANIFEST has 1 artifact locations
19/12/02 13:31:58 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temptzRH9F/artifactss1oSRH/job_6161bc2f-aefd-4893-932e-88ba8830c790/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/02 13:31:58 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9
19/12/02 13:31:58 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1202133051-43df1a43_9e23aee1-2836-4d99-9636-9c9804ccf1e9
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575293519.258689491","description":"Error received from peer ipv4:127.0.0.1:40483","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575293519.258695992","description":"Error received from peer ipv4:127.0.0.1:40633","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575293519.258649384","description":"Error received from peer ipv4:127.0.0.1:43549","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575293519.258649384","description":"Error received from peer ipv4:127.0.0.1:43549","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_14-6278760345176488085?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_16_53-15380731864994675933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_24_46-2504958118559138873?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test suite for <class 'apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests'>
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/suite.py",> line 177, in __call__
    return self.run(*arg, **kw)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 822, in run
    test.config.plugins.addError(test,err)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/manager.py",> line 99, in __call__
    return self.call(*arg, **kw)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/manager.py",> line 167, in simple
    result = meth(*arg, **kw)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/xunit.py",> line 287, in addError
    tb = format_exception(err, self.encoding)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/pyversion.py",> line 214, in format_exception
    ''.join(traceback.format_exception(*exc_info)),
  File "/usr/lib/python2.7/traceback.py", line 141, in format_exception
    list = list + format_tb(tb, limit)
  File "/usr/lib/python2.7/traceback.py", line 76, in format_tb
    return format_list(extract_tb(tb, limit))
  File "/usr/lib/python2.7/traceback.py", line 101, in extract_tb
    line = linecache.getline(filename, lineno, f.f_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/lib/python2.7/linecache.py",> line 14, in getline
    lines = getlines(filename, module_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/lib/python2.7/linecache.py",> line 41, in getlines
    return updatecache(filename, module_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/lib/python2.7/linecache.py",> line 132, in updatecache
    lines = fp.readlines()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: '<nose.plugins.multiprocess.NoSharedFixtureContextSuite context=BigQueryStreamingInsertTransformIntegrationTests>'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5495.832s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_18-7904350070443688309?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_23_10-2173333601179733897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_31_11-16077709667909561059?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_37_46-13886407997388846644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_45_21-14081241619877455582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_52_55-13165115216165621033?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_59_59-8261334257477204770?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_14-4119904330565038063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_27_34-5332664651990505368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_34_24-3663530673440977212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_41_17-3750813203213705615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_16-226000092014953341?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_20_42-13736251037763750279?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_27_39-15368087197258814679?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_35_00-15214977253026549982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_42_07-1324197345787735487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_14-8068736207139889159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_26_55-8498810401572233667?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_34_38-3909870563291537781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_42_09-18077852163037853430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_49_23-9202818139944147566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_14-7998842940012040531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_16_53-6910924176331696847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_24_29-630789220544336288?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_31_35-6594398844466356214?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_39_01-6723114402706179369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_13-1213644584627153442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_14_54-328723380319134748?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_24_15-9566250965753854462?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_32_07-14939344324806858982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_39_08-9404775821032241051?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_46_14-14846294776730670634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_08_14-7580079793348676500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_17_33-11712747148527562179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_27_33-13124256302963956924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_04_44_58-11720383703133465147?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 42s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/qao22fetligxe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1115

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1115/display/redirect>

Changes:


------------------------------------------
[...truncated 2.01 MB...]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f5401137ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f5401153050> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f54011530c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f5401153140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f54011531b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f5401153320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f5401153398> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f5401153410> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f53f2012ad0> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.362 seconds
mongoioit27878
mongoioit27878

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_01-16623562313919615687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_15_55-6472920145461149815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_24_12-1989983035250794466?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5521.426s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_35-12345086230523507194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_22_01-917423121460858744?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_30_53-2945993314517816467?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_38_07-2905243631350403936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_46_31-14939299977003695186?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_53_59-14258284568664800925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_23_01_33-11090494424267265087?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_29-6938622401290959507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_25_30-2749871682612054271?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_43_45-8353574926146564255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_33-8343510047691699010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_20_22-4323725664373549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_27_33-15455766786504756819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_35_46-2408069878275730950?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_42_41-1753096602387414090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_29-14298668968352387181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_23_30-1103988145583484196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_32_08-13271267532796066703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_40_08-298352651150429334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_29-17466574724507587949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_15_11-13033554290365221745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_24_31-14972081011325138416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_33_29-15849590671593286619?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_41_23-9791629954872607388?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_49_32-16465500065989183983?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_29-97944559638397154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_15_12-11899735698257549158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_24_03-9178121899504192177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_31_22-5597859051907478861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_39_27-17317977829652210610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_48_07-13233751439203930573?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_07_29-3733866604869128438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_16_02-15630069021971510359?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_26_41-8427652105700280854?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_34_36-9679871309989607753?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_41_28-13239788778909293341?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 12s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/c64bqkhrio7xs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1114

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1114/display/redirect>

Changes:


------------------------------------------
[...truncated 2.29 MB...]
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }
                      ], 
                      "is_pair_like": true, 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s24"
        }, 
        "serialized_fn": "ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1_39", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s42", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }
                      ], 
                      "is_pair_like": true, 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s29"
        }, 
        "serialized_fn": "ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2_40", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-12-02T00:11:09.235302Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-12-01_16_11_07-16386268534534974609'
 location: u'us-central1'
 name: u'beamapp-jenkins-1202001055-350763'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-12-02T00:11:09.235302Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-01_16_11_07-16386268534534974609]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_11_07-16386268534534974609?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5412.222s

FAILED (SKIP=5, errors=3, failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_02_56-10890124138042660512?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_20_17-427267881106203918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_27_02-7646790433337150004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_33_44-7474732033163054267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_03_01-10879013246212006931?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_15_20-1565631068827571299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_17_30-238797734663685073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_24_17-14920079709545223654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_32_02-15911478773524229339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_02_58-1523655358913835985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_21_24-3937962147587254002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_28_34-8816094980590700556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_36_08-16586227279942847068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_43_27-3393768073358715526?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_50_31-15700187856135018200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_02_58-14909571163592509946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_11_29-5971396986130201553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_19_54-11773386062067573982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_37_15-13598959168724593084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_02_58-17717192554900843564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_10_32-17224672862635264695?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_19_28-11088071644612130756?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_27_32-10843351271415419407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_35_03-6535416106357812555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_41_52-7427667255447032209?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_02_59-12481106130439644802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_11_07-16386268534534974609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_11_33-11102685935679320138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_17_54-3830921419590101200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_24_09-2819930289645709096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_30_55-12748124274305885371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_37_31-11898030991982473630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_02_58-17379328512816375466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_12_13-10942678394331377127?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_14_25-5987364543432979339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_22_35-13583460899711809947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_16_30_18-17774427004161823210?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 25s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/5e3rasincgqus

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1113

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1113/display/redirect>

Changes:


------------------------------------------
[...truncated 1.88 MB...]
              }
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)", 
        "windowing_strategy": "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01"
      }
    }, 
    {
      "kind": "CollectionToSingleton", 
      "name": "SideInput-s20", 
      "properties": {
        "output_info": [
          {
            "encoding": {
              "@type": "kind:stream", 
              "component_encodings": [
                {
                  "@type": "kind:windowed_value", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }, 
                    {
                      "@type": "kind:global_window"
                    }
                  ], 
                  "is_wrapper": true
                }
              ], 
              "is_stream_like": {
                "value": true
              }
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s17"
        }, 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)", 
        "windowing_strategy": "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_finalize_write"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {
          "side0-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s18"
          }, 
          "side1-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s19"
          }, 
          "side2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s20"
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 2376 bytes>", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-12-01T18:11:34.824214Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-12-01_10_11_33-5056837705875729519'
 location: u'us-central1'
 name: u'beamapp-jenkins-1201181121-419996'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-12-01T18:11:34.824214Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-01_10_11_33-5056837705875729519]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_11_33-5056837705875729519?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
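An aside for readers of the job JSON above: the "windowing_strategy" values are percent-encoded serialized WindowingStrategy protos, so fragments of them can be inspected with nothing more than the standard library. A minimal sketch, assuming a Python 2 interpreter to match this suite (on Python 3 the function lives at urllib.parse.unquote):

    import urllib

    # Paste the full value of a "windowing_strategy" field here; truncated for brevity.
    encoded = "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1..."
    decoded = urllib.unquote(encoded)
    # Readable URNs such as beam:coder:global_window:v1 and
    # beam:windowfn:global_windows:v0.1 appear inline in the decoded bytes.
    print(repr(decoded))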

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5475.107s

FAILED (SKIP=5, errors=1, failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_09-6383290503137641399?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_18_08-7561676218887659170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_26_07-829278522065801910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_33_43-6334897893003183296?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_03-40885345896083927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_23_55-1120500787017996377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_30_34-2501154775401029289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_37_48-16851573287471655346?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_45_07-8370832760581105996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_52_31-18157657563958034465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_07-7085886236405716214?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_15_55-259390996701785622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_22_58-12344915467736725491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_30_04-15178319562206117926?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_37_11-15521484404669535843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_05-16094079208034215557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_19_12-1450459986578312098?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_26_47-16966992429655488311?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_34_11-2155887833442216744?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_41_11-7336479573525112469?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_04-9882578486991424393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_10_36-7699049372379993770?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_18_18-8411286341597940732?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_24_54-17584654191028158551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_32_14-7439443107765611321?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_06-17967191436057291533?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_11_33-5056837705875729519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_11_57-7273657061409962907?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_18_39-1083986267238062551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_25_15-14145494372936415196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_32_06-17919291910157039157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_38_35-2081335872962375852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_03_05-9212938042556889885?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_12_06-5909123916517908668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_22_46-10338854591106082442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_10_40_18-24005318075071266?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 24s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/tnbevn5oswvbq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1112

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1112/display/redirect>

Changes:


------------------------------------------
[...truncated 1.82 MB...]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7ff7a25c6de8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7ff7a25c6e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7ff7a25c6ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7ff7a2674050> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7ff7a26740c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7ff7a2674140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7ff7a26741b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7ff7a2674320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7ff7a2674398> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7ff7a2674410> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7ff799322a50> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/1)))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/0))))))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.551 seconds

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
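The BigQuerySink deprecation above recurs throughout these runs. Below is a minimal sketch of the replacement the warning names, WriteToBigQuery, with a hypothetical table spec and a one-column schema; the dispositions mirror the CREATE_IF_NEEDED/WRITE_EMPTY settings visible in the job JSON elsewhere in this log:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'fruit': 'apple'}])  # hypothetical rows for the schema below
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.output_table',  # hypothetical table spec
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))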
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
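The recurring "options is deprecated since First stable release" warning refers to reading options back off the Pipeline object via <pipeline>.options. A minimal sketch of the pattern the SDK recommends instead: keep a reference to the PipelineOptions you constructed and view it directly (the temp location below is a hypothetical stand-in):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
    # Read settings from the options object you hold, not from p.options:
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)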
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
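MatchAll and ReadMatches, flagged as experimental in the FutureWarnings above, come from apache_beam.io.fileio. A minimal sketch of the match-then-read shape the test exercises, with a hypothetical file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('gs://my-bucket/path/*.txt')  # hypothetical glob
         | fileio.ReadMatches()
         | 'GetPath' >> beam.Map(lambda readable_file: readable_file.metadata.path))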
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_45-16402302885765276074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_11_36-5019442744347618304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_22_10-5134834147530628520?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
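The TimedOutException above does not come from the HTTP stack: it is raised by a signal handler that nose's multiprocess plugin installs, which fires when a test exceeds its time budget and interrupts whatever blocking call is in flight (here, an SSL socket read inside the Dataflow job-polling loop). A minimal sketch of that alarm-signal mechanism, independent of nose and POSIX-only:

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # Interrupts whatever the process is blocked on, mirroring the
        # traceback above where a socket read was cut short.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(2)      # hypothetical 2-second budget
    try:
        time.sleep(10)   # stands in for the blocking socket read
    except TimedOutException:
        print('test timed out')
    finally:
        signal.alarm(0)  # cancel any pending alarm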

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5613.080s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_51-1525552584817832213?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_19_07-1847884036600020308?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_27_58-3391265229082800533?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_34_47-14887986811230203726?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_42_40-3792148454971688797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_50_23-4935309458649714854?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_57_41-10403852920685444073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_43-8152756475791754885?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_22_36-11763138300972920768?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_31_31-459832814598985420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_39_20-8855873252806757075?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_46_55-9463063463084789466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_51-6032445176109824943?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_17_10-6772459849369690377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_25_23-111144796505894945?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_33_27-14735346856329114217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_40_57-9588627538719657869?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_45-16428940482860915054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_24_10-4769377990617013054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_42_35-14582174349274644846?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_45-12922616334689177849?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_12_16-13553804938981245596?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_20_47-13552367779470544682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_27_43-5723578377681054146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_34_49-800646745759556288?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_44_07-3838926347733632466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_44-2263465239136372958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_11_41-15799094903106357890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_20_04-9269146865908431882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_28_53-8047792785163901401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_37_28-17188026365429691673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_03_45-8612853147815344848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_13_14-9124211953339469634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_24_22-11992219773628258474?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_32_13-2276493608567933111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_04_40_36-5644633152406332865?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 35m 3s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/nq5t32ir2amts

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1111

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1111/display/redirect>

Changes:


------------------------------------------
[...truncated 2.18 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/setuptools-42.0.1.zip", 
            "name": "setuptools-42.0.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/setuptools-41.5.0.zip", 
            "name": "setuptools-41.5.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/setuptools-41.0.1.zip", 
            "name": "setuptools-41.0.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/six-1.12.0.tar.gz", 
            "name": "six-1.12.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/funcsigs-1.0.2.tar.gz", 
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/setuptools-41.5.1.zip", 
            "name": "setuptools-41.5.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/setuptools-41.2.0.zip", 
            "name": "setuptools-41.2.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1201062544-344419.1575181544.344542/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20191112"
      }
    ]
  }, 
  "name": "beamapp-jenkins-1201062544-344419", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)", 
        "bigquery_use_legacy_sql": false, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15751815431224", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-12-01T06:25:56.945654Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-30_22_25_55-4358891219734577067'
 location: u'us-central1'
 name: u'beamapp-jenkins-1201062544-344419'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-12-01T06:25:56.945654Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-11-30_22_25_55-4358891219734577067]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_25_55-4358891219734577067?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
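For orientation, the two steps in the job JSON above (a ParallelRead driven by a BigQuery query and a ParallelWrite with a one-column schema) correspond roughly to a pipeline of the following shape. The query string is taken from the JSON; the project, dataset, and table names are hypothetical stand-ins for the test's generated ones:

    import apache_beam as beam

    query = ('SELECT * FROM (SELECT "apple" as fruit) '
             'UNION ALL (SELECT "orange" as fruit)')

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.Read(
             beam.io.BigQuerySource(query=query, use_standard_sql=True))
         | 'write' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.output_table',  # hypothetical table spec
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))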

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5486.020s

FAILED (SKIP=5, errors=1, failures=5)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_12-5698440254567921154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_17_55-1333242670236508644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_25_28-18251693214878064500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_33_15-5431548631099997001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_08-4880443722455906576?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_23_20-7149382004345942707?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_10-887803711883590853?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_16_33-15908909646061982823?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_24_29-10988039919126041617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_25_27-13403140720021575425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_25_54-7667403857240398722?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_26_20-12088004299088578984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_07-16973500381829079171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_19_51-4468197808245665461?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_28_02-2762104152031372626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_34_46-12577289122583819319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_08-6955061182388437584?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_10_54-9161787172778117146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_11_33-333269916112764160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_19_04-4306552753710894259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_25_55-4358891219734577067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_26_26-16115131206836758240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_33_16-8468763222558077438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_07-7200782177483645424?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_11_00-10270831909303508148?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_19_11-10284569847346620001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_26_06-14494991882577505239?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_03_08-11911524293853076317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_13_23-4584368585250411060?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_23_34-4245248468649578637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_31_04-13016679632721900452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_22_38_55-17169813238839271570?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 44s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/df4uoxfhqcb7k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1110

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1110/display/redirect>

Changes:


------------------------------------------
[...truncated 2.03 MB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575159310.704675025","description":"Error received from peer ipv4:127.0.0.1:41505","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
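_Rendezvous is grpc's RpcError subclass; the UNAVAILABLE / "Socket closed" status above is what a client sees when the other side of the Fn API control channel goes away during shutdown. A minimal sketch of how such a failure surfaces to calling code; the stub method is a placeholder for any generated gRPC call:

    import grpc

    def call_once(stub_method, request):
        # Separate UNAVAILABLE ("peer went away") from other RPC failures,
        # as in the _Rendezvous above.
        try:
            return stub_method(request)
        except grpc.RpcError as e:
            if e.code() == grpc.StatusCode.UNAVAILABLE:
                print('peer unavailable: %s' % e.details())
            raise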


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.880 seconds
mongoioit27102
mongoioit27102
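The mongodbioIT task above reads 100000 documents through ReadFromMongoDB and then checks them with assert_that, per the fused stage names in the log. A minimal sketch of that shape; the URI, database, collection, and expected values are hypothetical:

    import apache_beam as beam
    from apache_beam.io.mongodbio import ReadFromMongoDB
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        docs = (p
                | ReadFromMongoDB(uri='mongodb://localhost:27017',
                                  db='beam_it_db',       # hypothetical
                                  coll='beam_it_coll')   # hypothetical
                | 'Map' >> beam.Map(lambda doc: doc['number']))
        assert_that(docs, equal_to(list(range(100000))))  # illustrative expectation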

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
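
Each BigQuerySink deprecation above names WriteToBigQuery as the replacement. A minimal sketch of the suggested transform, with a hypothetical destination table and schema (rows stands for any PCollection of dicts):

    import apache_beam as beam

    _ = (rows | 'WriteToBQ' >> beam.io.WriteToBigQuery(
        'my-project:my_dataset.my_table',
        schema='word:STRING, count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
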
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_12-17786421687269693212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_12_25-11285233033996361948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_22_07-1799156243180845213?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1449, in cancel
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1764, in _conn_request
    content = response.read()
  File "/usr/lib/python2.7/httplib.py", line 603, in read
    s = self.fp.read()
  File "/usr/lib/python2.7/httplib.py", line 1430, in read
    return s + self._file.read()
  File "/usr/lib/python2.7/socket.py", line 355, in read
    data = self._sock.recv(rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
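
The TimedOutException above is raised by nose's multiprocess plugin, not by the test body: its signal handler kills any test that outlives the per-process timeout, so this is a hang or a slow job cancel rather than an assertion failure. If the suite is merely slow, the timeout can be raised with the plugin's own flags (the 4500s value below is illustrative):

    nosetests --processes=8 --process-timeout=4500 \
        apache_beam.io.gcp.bigquery_test:BigQueryStreamingInsertTransformIntegrationTests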

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5521.274s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_16-6467877567333296845?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_20_38-7434132014257307579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_28_41-10097761293736566637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_35_34-8169559905419027183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_42_42-12367353966688764995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_49_49-10058709585314859998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_57_59-1002579957490473208?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_16-17716807616290480513?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_23_46-14936357124677069519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_41_25-16636078342274762530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_14-16631853575695221405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_18_44-8335683430576046907?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_25_20-4910650138027380196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_31_55-9978223248988156403?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_39_59-5407875979122275468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_14-5317591502367580822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_22_15-2320154054427333088?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_29_54-355537181170687429?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_37_11-10624548951762858780?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_12-11143160916558744153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_13_23-17889999353596872461?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_21_55-14385468542923851841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_28_36-11670142136312030707?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_34_56-193289381710522366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_43_09-9188560605124866579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_13-16735812805898681996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_14_00-17688835574870303742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_21_50-16681353169389533391?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_30_22-2956914746250206369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_37_20-13303310582129246504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_44_29-2862285588081998711?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_05_14-9446774091280278924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_14_44-13047586212866809937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_26_02-17961550272466142718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_33_06-15036635629392133005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_16_39_38-14699367596489714989?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
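
To chase failure 1 locally, the failing task can be re-run with the diagnostic flags Gradle suggests; a sketch assuming a Beam checkout with the gradlew wrapper at the repository root:

    ./gradlew :sdks:python:test-suites:direct:py2:directRunnerIT --stacktrace --info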

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 21s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/37fcwnsnon56m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1109

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1109/display/redirect>

Changes:


------------------------------------------
[...truncated 1.79 MB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575137434.871119146","description":"Error received from peer ipv4:127.0.0.1:40797","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:apache_beam.runners.portability.fn_api_runner:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:__main__:Writing 100000 documents to mongodb finished in 52.571 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1575137386
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:85: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
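
ReadFromMongoDB is flagged experimental, but the IT's read side is small. A minimal sketch against a local MongoDB, reusing the db and collection names from the log above (the URI is an assumption):

    import apache_beam as beam
    from apache_beam.io.mongodbio import ReadFromMongoDB

    with beam.Pipeline() as p:
        numbers = (p
                   | 'Read' >> ReadFromMongoDB(
                       uri='mongodb://localhost:27017',
                       db='beam_mongodbio_it_db',
                       coll='integration_test_1575137386')
                   | 'Map' >> beam.Map(lambda doc: doc['number']))
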
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fa37bf5f848> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fa37bf5f938> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fa37bf5f9b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fa37bf5fa28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fa37bf5faa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fa37bf5fb90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fa37bf5fc08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fa37bf5fc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fa37bf5fcf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fa37bf5fe60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fa37bf5fed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fa37bf5ff50> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fa370eed310> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/1)))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/0))))))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.354 seconds

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_53-2043035810453986453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_09_03-350776916843078518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_17_52-8999207406051685529?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 717, in _RunMethod
    http = self.__client.http
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 324, in http
    @property
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5466.492s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_02_02-10625095481526485865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_16_25-17212469883613234769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_24_08-9159443122461388847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_30_43-5856499177835473125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_37_33-14637036451842972407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_44_56-8085010551727131545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_51_46-4765860714345659598?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_55-17953381351743751734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_19_14-2835426087498011637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_27_17-1743468660108317247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_34_20-4249396903678764432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_41_15-745332379698211949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_59-16073022993848135517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_14_09-7396233147966540101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_20_38-7475828128619221259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_27_06-12515541307774503160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_33_33-4286484303844675999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_55-14920928390767598351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_21_11-6883838598776258803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_28_23-10756316836391834194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_35_19-1179468522304054868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_56-15908369351712971049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_09_31-3764771809782010850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_17_34-10840824016121511468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_24_44-7678920390995232696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_31_40-9407024375479433680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_58-14958847216808086248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_09_52-2762275398982537?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_17_12-1177038203876575065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_25_05-10052277054454569747?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_31_44-8231559464571296074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_39_39-8935015611729525066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_55-15980942358089003296?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_10_35-7848820637959990487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_19_51-16902147212322759700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_37_36-5900831866044888758?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 17s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/cieglzpvgf3iu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1108

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1108/display/redirect>

Changes:


------------------------------------------
[...truncated 1.80 MB...]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f27a2fda848> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f27a2fda938> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f27a2fda9b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f27a2fdaa28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f27a2fdaaa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f27a2fdab90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f27a2fdac08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f27a2fdac80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f27a2fdacf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f27a2fdae60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f27a2fdaed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f27a2fdaf50> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f27a3e2bd10> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 31.978 seconds

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_04-14639107881094709866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_10_12-6287029015039158086?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_19_19-5810427278672216362?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
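
For context on the traceback above: nose's multiprocess plugin enforces its per-test timeout with a SIGALRM handler, which is why TimedOutException surfaces in the middle of an unrelated SSL read while the test polls Dataflow for the job state. A minimal sketch of that mechanism (not the plugin's actual code; the timeout comes from its --process-timeout setting):

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # Fires wherever the process happens to be -- e.g. deep inside
        # a blocking socket read, as in the traceback above.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(2)  # the plugin derives this from --process-timeout

    try:
        time.sleep(10)  # stands in for the blocked Dataflow poll
    except TimedOutException:
        print('test timed out')
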

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5552.559s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_10-6696166598122794198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_16_42-5458767247820086216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_25_04-1482269236036516896?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_31_51-810077808949706556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_38_32-12084891252161423438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_04-1834912321534474431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_23_02-12141347480873447777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_29_48-4032050613698193143?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_36_16-8601290841398136889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_09-13917784102744926829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_14_26-7436608233782456799?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_20_06-18346748004886277989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_26_26-11212997030985452001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_32_47-6604438070583777110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_10-18058984742783897125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_21_14-15914564844961364938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_29_28-11915322564305111628?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_37_15-2012503414921755538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_43_48-13747495427943399617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_05-7705606811431272798?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_10_39-939890168552954606?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_17_57-16329710893903660970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_24_32-14441157014091874891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_30_53-2416590855440258237?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_37_47-1251909276511102140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_45_06-14897379544934649581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_52_25-3276138446206233601?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_06-18331480055924307684?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_11_47-6135536612198244280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_21_17-3347950687627471119?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_38_56-1222503843871811613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_02_08-7383942675566189637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_10_17-17613108679966141210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_18_06-18250198490009325432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_24_52-16901651052899854788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_04_32_03-846596501365311824?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 43s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/qs5thmvbk5l2o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1107

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1107/display/redirect>

Changes:


------------------------------------------
[...truncated 1.88 MB...]
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/six-1.13.0.tar.gz", 
            "name": "six-1.13.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/setuptools-42.0.1.zip", 
            "name": "setuptools-42.0.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/setuptools-41.5.0.zip", 
            "name": "setuptools-41.5.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/setuptools-41.0.1.zip", 
            "name": "setuptools-41.0.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/six-1.12.0.tar.gz", 
            "name": "six-1.12.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/funcsigs-1.0.2.tar.gz", 
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/setuptools-41.5.1.zip", 
            "name": "setuptools-41.5.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/setuptools-41.2.0.zip", 
            "name": "setuptools-41.2.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/pickled_main_session", 
            "name": "pickled_main_session"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130063249-113574.1575095569.113692/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20191112"
      }
    ]
  }, 
  "name": "beamapp-jenkins-1130063249-113574", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputde4559f8-777e-4823-a78a-0a7b48484fad"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputde4559f8-777e-4823-a78a-0a7b48484fad", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputde4559f8-777e-4823-a78a-0a7b48484fad", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
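
Read back into SDK terms, the three steps in the job payload above (s1 ParallelRead, s2 ParallelDo, s3 ParallelWrite) correspond roughly to the following pipeline shape. This is a hedged sketch: the DoFn body is stubbed, and the Pub/Sub names are shortened placeholders for the ones in the spec.

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions

    class StreamingUserMetricsDoFn(beam.DoFn):  # stub of the real DoFn
        def process(self, element):
            yield element

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | 'ReadFromPubSub' >> ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input-sub>')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> WriteToPubSub(
             'projects/<project>/topics/<output-topic>'))
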
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-11-30T06:33:03.660250Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-29_22_33_01-4836594182868583197'
 location: u'us-central1'
 name: u'beamapp-jenkins-1130063249-113574'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-11-30T06:33:03.660250Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-11-29_22_33_01-4836594182868583197]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_22_33_01-4836594182868583197?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5471.283s

FAILED (SKIP=5, errors=1, failures=2)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 28s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/pc4heb3rxm7s2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1106

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1106/display/redirect>

Changes:


------------------------------------------
[...truncated 2.07 MB...]
        "user_name": "count"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s8", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "format_result"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "format.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "ref_AppliedPTransform_format_10", 
        "user_name": "format"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s9", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "encode.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s8"
        }, 
        "serialized_fn": "ref_AppliedPTransform_encode_11", 
        "user_name": "encode"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s10", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s9"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_outputde5b2a17-bc7f-4985-904c-6b6f078331dc", 
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
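
Steps s8-s10 above are the tail of the streaming wordcount job: format each (word, count) pair, utf-8 encode it, and write it to Pub/Sub. A hedged sketch of that fragment (topic shortened to a placeholder; the upstream count step s7 is faked with a Create):

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions

    def format_result(word_count):
        (word, count) = word_count
        return '%s: %d' % (word, count)

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | 'counts' >> beam.Create([('beam', 2), ('flink', 1)])  # stands in for s7
         | 'format' >> beam.Map(format_result)
         | 'encode' >> beam.Map(lambda line: line.encode('utf-8'))
         | 'WriteToPubSub' >> WriteToPubSub(
             'projects/<project>/topics/<wc-output-topic>'))
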
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-11-30T00:09:28.360329Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-29_16_09_26-15493994667975359285'
 location: u'us-central1'
 name: u'beamapp-jenkins-1130000917-258988'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-11-30T00:09:28.360329Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-11-29_16_09_26-15493994667975359285]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_09_26-15493994667975359285?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5443.081s

FAILED (SKIP=5, errors=1, failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_18-4834125177891214859?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_16_19-3268972192525874946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_23_39-28376418992759971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_31_05-6868699632393156517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_15-7188857035956763460?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_21_47-3807134133318360999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_28_48-13631371992782005307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_35_47-16831784639909184147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_16-609292735549518411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_14_53-13489684766365364038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_22_04-13028746858926555057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_28_42-7348728751333076259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_35_45-13230143026806140353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_42_55-4741555218303368659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_49_44-12001865589871654404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_14-9750619480966644825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_20_39-4867709520509541045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_38_17-8852491288874492287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_14-18268605103961197057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_09_57-8594370480441641956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_16_42-16918301366639467523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_23_31-14222689255303389483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_31_07-12791084957367540828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_38_11-17020518716749944300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_15-3306696460426714801?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_09_37-4711778994888397115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_17_13-440858025453099928?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_24_19-7558362285754720510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_30_56-5735209847149887824?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_02_15-13851520082520411179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_10_35-9194937862034082158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_19_36-5398178630378899334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_27_13-11514782318766391853?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_34_28-4349612480122283256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_16_41_22-15477027011416860389?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 59s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/gwhzyc35cs2ow

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1105

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1105/display/redirect>

Changes:


------------------------------------------
[...truncated 1.59 MB...]
	details = "Socket closed"
	debug_error_string = "{"created":"@1575051069.160609597","description":"Error received from peer ipv4:127.0.0.1:39627","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575051069.160655138","description":"Error received from peer ipv4:127.0.0.1:35969","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575051069.160582738","description":"Error received from peer ipv4:127.0.0.1:37483","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575051069.160582738","description":"Error received from peer ipv4:127.0.0.1:37483","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
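
For context: the three _Rendezvous errors above are the SDK worker's control, data, and reader threads all observing the same event -- the peer closed its sockets during shutdown, which gRPC reports as StatusCode.UNAVAILABLE with "Socket closed". A hedged sketch of how a gRPC Python client can recognize this condition (the stub and request iterator are assumptions, not the worker's actual code):

    import grpc

    def is_peer_shutdown(err):
        # "Socket closed" surfaces as UNAVAILABLE when the peer goes
        # away mid-stream, exactly as in the log above.
        return (isinstance(err, grpc.RpcError)
                and err.code() == grpc.StatusCode.UNAVAILABLE)

    # Typical guard around a streaming call:
    # try:
    #     for response in stub.Control(request_iterator):
    #         handle(response)
    # except grpc.RpcError as err:
    #     if not is_peer_shutdown(err):
    #         raise
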


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
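
The TimedOutException above is raised by nose's multiprocess plugin, not by the test body: the plugin installs a SIGALRM handler and aborts any test that exceeds its per-test timeout, which is why the traceback bottoms out deep inside ssl.read() rather than in test code. A minimal sketch of the mechanism (not the plugin's actual code), with an illustrative sleep standing in for the hung Dataflow API call:

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def _handler(signum, frame):
        # Fires in whatever frame happens to be executing when the alarm
        # goes off, so the traceback ends wherever the test was blocked.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, _handler)
    signal.alarm(2)      # illustrative 2s budget; the plugin's timeout is configurable
    try:
        time.sleep(60)   # stands in for the blocked HTTP read above
    except TimedOutException:
        print('test timed out')
    finally:
        signal.alarm(0)  # always cancel the pending alarm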

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5483.376s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 44s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/oykq7wtgdovqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1104

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1104/display/redirect>

Changes:


------------------------------------------
[...truncated 1.81 MB...]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fbd3765cde8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fbd3765ce60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fbd3765ced8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fbd37678050> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fbd376780c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fbd37678140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fbd376781b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fbd37678320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fbd37678398> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fbd37678410> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fbd357eb0d0> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 22.480 seconds
mongoioit27125
mongoioit27125
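
The fused stages logged above come from the MongoDB read integration test on the portable Flink runner. Under stated assumptions (illustrative connection values and field name; the real test writes and verifies 100000 documents), the pipeline has roughly this shape:

    import apache_beam as beam
    from apache_beam.io.mongodbio import ReadFromMongoDB
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        numbers = (p
                   | 'ReadFromMongoDB' >> ReadFromMongoDB(
                       uri='mongodb://localhost:27017',  # illustrative
                       db='testdb', coll='testcoll')
                   | 'Map' >> beam.Map(lambda doc: doc['number']))
        assert_that(numbers, equal_to(list(range(100000))))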

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_04-15415314868991117337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_09_46-18414941488052030130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_18_00-14359412374032652181?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR
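
The recurring BeamDeprecationWarning lines in the listing above all flag the same pattern: reading options back off an already-constructed pipeline via <pipeline>.options. A minimal sketch of the supported direction, with illustrative flag values: build the PipelineOptions first, keep a reference, and call view_as on that object rather than on the pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])  # illustrative
    temp_location = options.view_as(GoogleCloudOptions).temp_location    # not p.options
    p = beam.Pipeline(options=options)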

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
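
This run timed out one frame earlier than the previous one: inside TestDataflowRunner's cancel() path rather than its wait loop. Either way the runner is polling job state over the Dataflow API, and each state read is itself a blocking HTTP call. A sketch of a deadline-bounded poll (a hypothetical helper, not the SDK's actual code):

    import time

    def wait_for_state(result, expected, timeout=600, poll_interval=5):
        # Each read of result.state is a blocking Dataflow API request,
        # which is exactly where the test above got stuck.
        deadline = time.time() + timeout
        while time.time() < deadline:
            if result.state == expected:
                return
            time.sleep(poll_interval)
        raise RuntimeError('gave up waiting for job state %s' % expected)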

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5462.709s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_10-9768547866810730634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_16_13-10489015616245869511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_22_59-8723509839009733860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_29_05-7097292968978888415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_35_57-3696945241316830789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_42_54-8868771315548977640?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_49_45-5048847822173666943?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_04-5965966447347329940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_21_23-730799349046482303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_27_57-12651582167381739556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_35_17-17228154877790440046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_09-2340181632782259508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_14_56-6568414517974484517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_21_25-12546796594859408350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_28_13-12517286090512748004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_34_57-3611789164773490527?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_04-8015469202610840887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_19_44-3129868374911739270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_27_25-8583971297950009710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_34_22-7195193685391702552?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_41_25-4540665672207332250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_03-1436265357319848638?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_08_41-11713553190765351317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_17_32-14739357639016179472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_25_06-4160238878502287017?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_31_25-16839410023365340893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_38_24-17971715287175039301?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_05-1699631898220783230?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_09_44-10315080780213686898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_17_36-11081882419929286458?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_23_55-17423192327556843626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_31_18-9010002072891464217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_02_04-13591990950626447101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_10_42-9857751966023278111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_20_45-18093451595841774637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_04_38_03-15299050325468472399?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 25s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ryt52cxpugltm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1103

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1103/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-4776] Add metrics support to Java PortableRunner


------------------------------------------
[...truncated 3.88 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/setuptools-41.0.1.zip", 
            "name": "setuptools-41.0.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/six-1.12.0.tar.gz", 
            "name": "six-1.12.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/funcsigs-1.0.2.tar.gz", 
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/setuptools-41.5.1.zip", 
            "name": "setuptools-41.5.1.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/setuptools-41.2.0.zip", 
            "name": "setuptools-41.2.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1129104357-292489.1575024237.292711/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20191112"
      }
    ]
  }, 
  "name": "beamapp-jenkins-1129104357-292489", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15750242357946", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
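
The job payload above (a ParallelRead step running a legacy-SQL BigQuery query into a ParallelWrite/NativeWrite step) corresponds to roughly this pipeline shape, as in big_query_query_to_table_pipeline.py; the options and table names here are illustrative, not the test's exact values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    QUERY = 'SELECT * FROM (SELECT "apple" as fruit), (SELECT "orange" as fruit)'

    options = PipelineOptions()  # project/temp_location flags omitted for brevity
    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=QUERY))  # legacy SQL by default
         | 'write' >> beam.io.WriteToBigQuery(
             'output_table',
             dataset='python_query_to_table_15750242357946',
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
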
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '591', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Fri, 29 Nov 2019 10:44:09 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'Job creation requests' and limit 'Job creation requests per minute per user' of service 'dataflow.googleapis.com' for consumer 'project_number:844138762903'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developer console API key",
            "url": "https://console.developers.google.com/project/844138762903/apiui/credential"
          }
        ]
      }
    ]
  }
}
>
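
The DEBUG lines above show the client retrying job creation after a 429 quota error; the SDK wraps these calls with the exponential backoff in apache_beam.utils.retry (visible as retry.py in the tracebacks), and the job is created on a later attempt. A self-contained sketch of that strategy, with a placeholder exception type standing in for the real HTTP error:

    import random
    import time

    class QuotaExceededError(Exception):
        """Placeholder for an HTTP 429 response from the Dataflow API."""

    def call_with_backoff(fn, max_attempts=5, initial_delay=1.0):
        # Retry fn() on quota errors, doubling the delay each attempt and
        # adding jitter so parallel test workers don't retry in lockstep.
        for attempt in range(max_attempts):
            try:
                return fn()
            except QuotaExceededError:
                if attempt == max_attempts - 1:
                    raise
                time.sleep(initial_delay * (2 ** attempt) + random.random())
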
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-11-29T10:44:13.424366Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-29_02_44_11-7535465988915911105'
 location: u'us-central1'
 name: u'beamapp-jenkins-1129104357-292489'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-11-29T10:44:13.424366Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-11-29_02_44_11-7535465988915911105]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_11-7535465988915911105?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 1308.955s

FAILED (SKIP=5, errors=15, failures=6)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_11-7535465988915911105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_52-1307895067465284676?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_51_03-15109173874374346283?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_57_13-12553968719505360336?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_43_37-461848580336628199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_45_22-2719031903801533644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_25-3629819988836718478?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_45_43-8003820888778458440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_52_42-7508423880843886571?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_17-8367534494201555730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_45_38-16182453992846126248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_24-11623070806283289958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_54-5938661706030911079?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_51_34-2070362931687419588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_58_07-12684602636691344354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_43_24-15898583299651168274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_44_46-16441555565828292553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_45_15-9868755133181309890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_52_24-5735781673008665238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-29_02_46_34-3011810453706814782?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 16s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/cbpqgtpz3sryk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1102

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1102/display/redirect>

Changes:


------------------------------------------
[...truncated 1.81 MB...]
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575008019.666005796","description":"Error received from peer ipv4:127.0.0.1:35343","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575008019.666023385","description":"Error received from peer ipv4:127.0.0.1:39717","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575008019.665974895","description":"Error received from peer ipv4:127.0.0.1:46429","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575008019.665974895","description":"Error received from peer ipv4:127.0.0.1:46429","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
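
The _Rendezvous errors above are the worker's streaming RPCs being torn down as the embedded job server shuts down; StatusCode.UNAVAILABLE with "Socket closed" is the usual symptom of that shutdown race rather than a data-plane failure. A sketch of how a consumer could tolerate it when draining a gRPC stream (a hypothetical wrapper, not the SDK's code):

    import grpc

    def drain_stream(response_iterator):
        # Consume a streaming gRPC response, swallowing the UNAVAILABLE
        # ("Socket closed") error raised when the peer goes away first.
        try:
            for message in response_iterator:
                yield message
        except grpc.RpcError as err:
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise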


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5464.487s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 23s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/onwslx2jfebzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1101

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1101/display/redirect?page=changes>

Changes:

[milantracy] [BEAM-8406] Add support for JSON format text tables


------------------------------------------
[...truncated 1.80 MB...]
	details = "Socket closed"
	debug_error_string = "{"created":"@1574992480.753529229","description":"Error received from peer ipv4:127.0.0.1:42553","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574992480.753652315","description":"Error received from peer ipv4:127.0.0.1:34263","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574992480.753529229","description":"Error received from peer ipv4:127.0.0.1:42553","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574992480.753644940","description":"Error received from peer ipv4:127.0.0.1:44855","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
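
(Aside on the "options is deprecated" warnings throughout this suite: they refer to reading options back off the pipeline via <pipeline>.options. A minimal sketch of the supported pattern, keeping a reference to the PipelineOptions you constructed instead; the bucket below is a placeholder:)

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(temp_location='gs://my-bucket/tmp')  # hypothetical bucket
    # Read settings from the options object itself, not from p.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
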
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5475.190s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 26s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/v2ofh2d5zobl6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1100

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1100/display/redirect>

Changes:


------------------------------------------
[...truncated 2.01 MB...]
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574986716.552107373","description":"Error received from peer ipv4:127.0.0.1:37767","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_03_58-17595650825415703221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_11_08-3790095079764180865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_20_11-16492212468656109359?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5480.762s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_03_56-16927188939746154522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_24_33-8664952342064860545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_31_35-13882673287728476315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_38_58-1849168172242524027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_04_01-7559043540024561113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_18_58-7935964190033304222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_27_16-2502582518366350165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_34_33-15843231830839181280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_03_57-9727357109309680902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_21_32-14338932928945475881?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_28_50-3949066905401848740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_35_40-16288216559046620503?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_42_40-73897795384437878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_04_01-15218823577833247761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_16_48-17354865851872817650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_23_28-13807291468688251799?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_30_06-4037136118407402473?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_37_02-11809832580245741684?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_03_58-15190604848730391452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_11_44-6118084090325248221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_20_03-16464146581128658013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_26_35-9570403950378550280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_34_01-10032981577492851548?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_41_57-5604710368638611155?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_03_55-3358241493820763077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_11_51-6476728962823477955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_19_07-1469382278172736629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_25_32-395197538112886673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_31_22-6233325228940356122?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_39_02-13079400982337796696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_46_30-3848693649028190240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_53_10-3438860801709944891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_03_59-16079502627662835664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_12_14-14612444883174220337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_22_01-17718912050553934760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_16_39_33-8909252163353318790?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 37s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ugpax7qlsjoha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1099

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1099/display/redirect>

Changes:


------------------------------------------
[...truncated 1.60 MB...]
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574964533.393702155","description":"Error received from peer ipv4:127.0.0.1:46467","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574964533.393702155","description":"Error received from peer ipv4:127.0.0.1:46467","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574964533.393717056","description":"Error received from peer ipv4:127.0.0.1:43225","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
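
Note: the FutureWarnings above come from fileio's MatchAll/ReadMatches transforms, which were still marked experimental in this release. For context, a minimal self-contained sketch of the pattern the test exercises (the glob is invented):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        contents = (p
                    | fileio.MatchFiles('gs://my-bucket/input*.txt')  # hypothetical
                    | fileio.ReadMatches()
                    | 'PathAndBody' >> beam.Map(
                        lambda f: (f.metadata.path, f.read_utf8())))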
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
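
Note: TimedOutException is raised by nose's multiprocess plugin, not by the test body: its signal handler fires when a test exceeds the per-test budget, so the traceback simply shows wherever the worker happened to be blocked (an SSL read, above). If the test is slow rather than hung, one option is raising the plugin's timeout; a sketch, with the value and target module chosen arbitrarily:

    import nose

    # --process-timeout is the multiprocess plugin's per-test limit in seconds.
    nose.main(argv=[
        'nosetests',
        '--processes=4',
        '--process-timeout=9000',  # hypothetical budget for long-running ITs
        'apache_beam.io.gcp.bigquery_test',
    ])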

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5484.586s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 26s
120 actionable tasks: 96 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/yyyb4vpzws66g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1098

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1098/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8656] Update documentation for flink_master parameter

[coheigea] A fix for some TLS issues in the MongoDB IO


------------------------------------------
[...truncated 1.83 MB...]
	details = "Socket closed"
	debug_error_string = "{"created":"@1574954576.952975329","description":"Error received from peer ipv4:127.0.0.1:37507","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574954576.952919652","description":"Error received from peer ipv4:127.0.0.1:40965","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574954576.952919652","description":"Error received from peer ipv4:127.0.0.1:40965","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 620, in __ProcessHttpResponse
    return self.__client.DeserializeMessage(response_type, content)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 446, in DeserializeMessage
    message = encoding.JsonToMessage(response_type, data)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 294, in __decode_dictionary
    setattr(message, field.name, self.decode_field(field, value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 294, in __decode_dictionary
    setattr(message, field.name, self.decode_field(field, value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 215, in decode_message
    message.check_initialized()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py",> line 807, in check_initialized
    for name, field in self.__by_name.items():
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5531.761s

FAILED (SKIP=5, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 37s
120 actionable tasks: 99 executed, 18 from cache, 3 up-to-date

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 7d0cbb1e-8845-47b3-9503-4cd927aea890
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1097

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1097/display/redirect?page=changes>

Changes:

[piotr.szczepanik] [BEAM-8819] Fix AvroCoder serialisation by introduction of

[piotr.szczepanik] Added missing license header for AvroGenericCoder

[piotr.szczepanik] Fixed code style violations

[piotr.szczepanik] Fixed missing AvroCoder -> AvroGenericCoder in python tests


------------------------------------------
[...truncated 2.18 MB...]
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s22"
        }, 
        "serialized_fn": "<string of 1340 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-11-28T14:08:22.797195Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-28_06_08_20-5333801521925908286'
 location: u'us-central1'
 name: u'beamapp-jenkins-1128140144-236709'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-11-28T14:08:22.797195Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-11-28_06_08_20-5333801521925908286]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_06_08_20-5333801521925908286?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-11-28_06_08_20-5333801521925908286 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:20.539Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-11-28_06_08_20-5333801521925908286. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:20.539Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-11-28_06_08_20-5333801521925908286.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:24.461Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:25.481Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.258Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.285Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.333Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from datastore/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.360Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.382Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.473Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.503Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.521Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read from datastore/UserQuery/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.544Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Reify into read from datastore/SplitQuery
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.568Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Write into read from datastore/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.593Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/GroupByWindow into read from datastore/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.618Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Values into read from datastore/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.641Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Flatten into read from datastore/Values
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.667Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from datastore/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.689Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/KeyWithVoid into read from datastore/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.717Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into Globally/CombineGlobally(CountCombineFn)/KeyWithVoid
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.740Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.766Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.837Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/UnKey into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.859Z: JOB_MESSAGE_DETAILED: Unzipping flatten s19 for input s17.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.880Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.907Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.932Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.955Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.978Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:26.999Z: JOB_MESSAGE_DETAILED: Unzipping flatten s19-u40 for input s20-reify-value9-c38
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.027Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.049Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.076Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.096Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.120Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into Globally/CombineGlobally(CountCombineFn)/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.141Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.165Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.189Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.215Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.236Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.263Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.287Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.403Z: JOB_MESSAGE_DEBUG: Executing wait step start51
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.450Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.480Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.480Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.501Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.505Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.534Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.548Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.548Z: JOB_MESSAGE_BASIC: Finished operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.586Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.610Z: JOB_MESSAGE_DEBUG: Value "read from datastore/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.639Z: JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.666Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:27.692Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from datastore/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:08:57.507Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
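
Note: the JOB_MESSAGE_WARNING above concerns the per-project cap on Dataflow-created custom metric descriptors. One way to prune stale descriptors is the Monitoring API; a rough sketch, assuming the pre-2.0 google-cloud-monitoring client and an illustrative metric-type filter:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(
            project_name,
            filter_='metric.type = starts_with("custom.googleapis.com/dataflow")'):
        # Deleting a descriptor also deletes its time series; prune carefully.
        client.delete_metric_descriptor(descriptor.name)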
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:09:01.287Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:10:07.903Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:10:07.927Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:11:29.190Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:11:29.243Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:11:29.243Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from datastore/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:11:29.353Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:11:29.417Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:11:29.448Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:13:34.066Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:13:34.101Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-28T14:13:34.127Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-11-28_06_08_20-5333801521925908286 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
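
Note: the captured log above ends with the job reaching JOB_STATE_FAILED, after which the test runner's cancel() path is what actually times out. A minimal sketch of checking the terminal state explicitly rather than relying on context-manager cleanup:

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    p = beam.Pipeline()  # options omitted for brevity
    # ... build the pipeline ...
    result = p.run()
    result.wait_until_finish()
    if result.state != PipelineState.DONE:
        raise RuntimeError('Job ended in state %s' % result.state)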

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
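
The TimedOutException above comes from nose's multiprocess plugin: it installs a signal handler that raises once a test exceeds its time budget (--process-timeout), which is why the traceback ends inside a blocking SSL read rather than in test code. A minimal sketch of that pattern, assuming a hypothetical 5-second budget and a sleep standing in for the blocking Dataflow job-status poll:

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # Interrupts whatever blocking call the process is in, just as
        # nose's handler interrupts the ssl.read() in the trace above.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(5)  # hypothetical budget; nose takes it from --process-timeout
    try:
        time.sleep(60)  # stand-in for the blocking HTTP call
    except TimedOutException:
        print('test timed out')
    finally:
        signal.alarm(0)  # clear the pending alarm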

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5465.254s

FAILED (SKIP=5, errors=3)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 10s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 300953ec-5f67-4d10-8165-8701436f3586
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1096

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1096/display/redirect>

Changes:


------------------------------------------
[...truncated 1.81 MB...]
mongoioit27413
mongoioit27413

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
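
The BeamDeprecationWarning above asks callers to move off BigQuerySink. A hedged sketch of that migration (the table and schema names here are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'beam', 'count': 1}])
        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink('project:dataset.table', ...))
        # Recommended replacement:
        rows | beam.io.WriteToBigQuery(
            'project:dataset.table',
            schema='word:STRING,count:INTEGER',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)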
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
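
Similarly, the "options is deprecated" warnings refer to reading options back off the Pipeline object; the supported pattern is to keep a reference to the PipelineOptions used to construct the pipeline. A minimal sketch (the temp_location value is hypothetical):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://bucket/tmp'])
    # Deprecated: p.options.view_as(GoogleCloudOptions).temp_location
    temp_location = options.view_as(GoogleCloudOptions).temp_location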
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
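
TestStream, the reason for the SKIP above, replays scripted element and watermark events and is only honored by runners that implement it (such as the DirectRunner), which is why the Dataflow suite skips the test. A hedged sketch; exact behavior may depend on SDK version:

    import apache_beam as beam
    from apache_beam.testing.test_stream import TestStream

    stream = (TestStream()
              .add_elements(['a', 'b'])
              .advance_watermark_to(10)
              .add_elements(['c'])
              .advance_watermark_to_infinity())

    with beam.Pipeline() as p:  # DirectRunner by default
        _ = p | stream | beam.Map(lambda x: x)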
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
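
The FutureWarnings above flag the experimental fileio transforms: MatchAll expands glob patterns into file metadata, and ReadMatches opens the matched files. A minimal sketch (the glob is hypothetical):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/data/*.txt'])  # hypothetical glob
             | fileio.MatchAll()                 # FutureWarning: experimental
             | fileio.ReadMatches()              # FutureWarning: experimental
             | beam.Map(lambda f: f.read_utf8()))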
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_46-12164176875535129494?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_09_27-8408803341603809135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_17_27-18144958137335745076?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
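
Note that the get_job call in this trace already runs under apache_beam.utils.retry's `wrapper`. A simplified sketch of that decorator pattern (the real module adds exception filters and jittered exponential backoff; names here are illustrative):

    import functools
    import time

    def with_retries(num_retries=3, initial_delay_secs=1.0):
        def decorator(fun):
            @functools.wraps(fun)
            def wrapper(*args, **kwargs):
                delay = initial_delay_secs
                for attempt in range(num_retries + 1):
                    try:
                        return fun(*args, **kwargs)
                    except Exception:
                        if attempt == num_retries:
                            raise  # budget exhausted, propagate
                        time.sleep(delay)
                        delay *= 2  # exponential backoff
            return wrapper
        return decorator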

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5448.377s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_50-411026393510873294?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_15_54-3270727391131573651?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_24_12-12344538378539240909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_31_45-16280654605678136575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_44-12804944004346293659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_19_13-8837850321117539197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_36_46-5862278497869295063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_49-11970917480017161715?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_14_07-8010195981349479569?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_20_55-6142474780240158623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_28_20-7611502836719444284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_35_29-13640079046164618934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_45-8752185635859164176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_18_11-9354290400413299436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_26_03-15617645880255287214?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_32_41-3843100289791376956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_39_33-12251524613102177857?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_44-16454512714580856727?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_09_33-10338669314711310241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_17_07-440081576973237521?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_24_15-9865342233680796832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_30_28-11669103026420393458?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_38_40-8564428081365386605?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_43-10801976484396512639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_08_20-16440888525082970956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_16_49-9124452824684358832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_23_16-7661321548529219733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_29_27-8712286601575815507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_36_45-5442488514072501189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_43_54-17682329428232579233?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_50_45-17236463199320226060?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_01_44-3438821434055227061?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_10_37-13132257736817531039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_21_00-9211380196156515699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_28_03-5331398764000401192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-28_04_35_19-2577598849403012435?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 57s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 3fab8ee8-bc8b-4866-8708-25fa631bd853
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1095

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1095/display/redirect>

Changes:


------------------------------------------
[...truncated 2.00 MB...]
mongoioit27880
mongoioit27880

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_13-14448872360932719574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_10_04-6328471456405942760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_18_20-6856645736911637172?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
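
This run times out one step later than the previous ones: the cancel request has already returned and run_pipeline is polling for the CANCELLED state when the nose timeout fires. A minimal sketch of that cancel-then-poll loop (hypothetical names, not the Beam source); each poll is an HTTP round trip that the signal can interrupt mid-read:

    import time

    def wait_until_in_state(get_state, expected, poll_secs=5.0, timeout_secs=600):
        """Polls get_state() until it returns `expected` or the budget runs out."""
        deadline = time.time() + timeout_secs
        while time.time() < deadline:
            state = get_state()  # e.g. a Dataflow jobs.Get API call
            if state == expected:
                return state
            time.sleep(poll_secs)
        raise RuntimeError('Timed out waiting for state %s' % expected)

    # Usage: wait_until_in_state(lambda: job.state, 'CANCELLED')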

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5486.610s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_19-2259660922154894475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_16_21-13470199914577177354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_24_08-15361927775867581307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_31_32-17947450528374686248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_38_47-1726332926435808484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_46_56-11304667215591339030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_53_51-14282921440263048558?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_13-17053063358064058886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_20_04-13793945166973682953?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_37_51-1076251108153021677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_18-447338526156186748?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_15_07-1321619007664197094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_22_14-7938097256380852170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_29_05-13153874251250434745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_35_56-17704156824050590089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_12-3037101822922297179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_18_18-8180486393289996476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_24_43-4504773933152794636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_31_58-3484404459696356813?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_39_39-9899297062767503905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_14-15769503576419211595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_09_30-15409617911331573827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_17_56-10072179258044499311?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_26_06-2351134277747336937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_34_41-3009605907015457588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_41_45-3248043019938220430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_14-12514528943918636008?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_10_23-3586619152758273835?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_18_35-6879179303974559523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_26_04-16301162467190115611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_33_50-152600361783691778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_02_14-213438589802936300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_11_27-12880877846828006034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_21_29-4850355158248595025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_28_50-17314944924097385053?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_22_36_13-5634104774171824429?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 43s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 36317b90-27d1-4979-ab20-3d65ed748b23
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1094

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1094/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8842] Temporarily disable test


------------------------------------------
[...truncated 1.82 MB...]

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 22.498 seconds

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
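
These FutureWarnings only note that the fileio transforms are still marked
experimental; the tests use them deliberately. A minimal sketch of the same
pattern, with a hypothetical glob in place of the tests' inputs:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        contents = (p
                    | fileio.MatchFiles('gs://my-bucket/input/*')  # hypothetical glob
                    | fileio.ReadMatches()  # yields readable file handles
                    | 'Read' >> beam.Map(lambda f: f.read()))
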
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
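
For context, a sketch of what a Datastore write through the Python SDK's
v1new API looks like; the project, kind and property names here are
placeholders, not this test's fixtures:

    import apache_beam as beam
    from apache_beam.io.gcp.datastore.v1new.datastoreio import WriteToDatastore
    from apache_beam.io.gcp.datastore.v1new.types import Entity, Key

    def to_entity(i):
        # Build one entity per input element under a hypothetical kind.
        entity = Entity(Key(['test_kind', i], project='my-project'))
        entity.set_properties({'value': i})
        return entity

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(list(range(3)))
             | beam.Map(to_entity)
             | WriteToDatastore('my-project'))
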
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_25-16845209282782879222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_30_52-14180649259197584181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_40_01-3760589146814067573?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1457, in cancel
    self.job_id(), 'JOB_STATE_CANCELLED'):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 643, in modify_job_state
    self._client.projects_locations_jobs.Update(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 758, in Update
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
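
The TimedOutException at the bottom is raised by nose's multiprocess plugin
(see the signalhandler frame): the test exceeded its per-test budget while
the runner was still cancelling the Dataflow job. When reproducing locally,
one way to bound the wait explicitly, assuming a pipeline object p and a
hypothetical one-hour budget:

    from apache_beam.runners.runner import PipelineState

    result = p.run()
    result.wait_until_finish(duration=60 * 60 * 1000)  # duration is in milliseconds
    if result.state not in (PipelineState.DONE, PipelineState.FAILED,
                            PipelineState.CANCELLED):
        result.cancel()  # cancel explicitly instead of timing out the harness
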

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5501.599s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_29-10389893212246392065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_37_50-1365760199504336280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_45_17-2558393071114419753?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_52_42-16963652495515618385?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_19_00_08-12819876663185567454?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_19_07_21-12995505697962934016?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_19_14_12-14344595080866639132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_25-2074728226119838922?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_41_22-9490633176715118264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_59_23-1930086846645633946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_26-18429763751050978977?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_36_10-16483550393723155710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_42_36-17454426534584648371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_49_16-10085403388448347554?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_56_16-16723751175653910446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_25-15930569826459433544?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_40_08-14981987092411689735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_48_16-9960331685580497203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_55_32-9790004436783718564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_19_02_10-18268830211345937690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_24-15238586357605497085?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_31_07-3815271709153340174?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_39_24-13868886905713953974?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_46_16-2657453956078989759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_53_16-17495061443307184828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_25-14305085180680317342?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_31_21-9340852500840202297?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_39_07-16805073285262844186?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_46_30-15984195953234686247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_53_25-11610632090290444934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_19_01_03-17638938061681290591?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_23_25-4342729214883425737?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_31_59-9254880715347049219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_41_53-5500766575803519999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_49_09-575340084706339255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_18_56_56-9223175600653161562?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 39s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 00f94eba-d3bc-49da-838b-d99554633abf
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1093

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1093/display/redirect>

Changes:


------------------------------------------
[...truncated 1.89 MB...]

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574902849.113906977","description":"Error received from peer ipv4:127.0.0.1:33059","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574902849.113963547","description":"Error received from peer ipv4:127.0.0.1:43155","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
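
Both reader threads die with StatusCode.UNAVAILABLE / "Socket closed", which
is what grpc raises on a streaming response iterator when the peer closes
the channel at shutdown. A sketch of tolerating exactly that status
(function and variable names are illustrative only):

    import grpc

    def drain(response_iterator):
        try:
            for response in response_iterator:
                yield response
        except grpc.RpcError as err:
            # UNAVAILABLE at end-of-job is effectively EOF here.
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise
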


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5537.986s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 36m 42s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 64cb6ead-132c-4556-9e8b-d6b93f1a3688
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1092

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1092/display/redirect?page=changes>

Changes:

[github] [BEAM-8840][BEAM-3713] Remove setup_requires, tests_require from


------------------------------------------
[...truncated 1.83 MB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574894888.835391019","description":"Error received from peer ipv4:127.0.0.1:43763","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:__main__:Read 100000 documents from mongodb finished in 21.973 seconds
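
For reference, the mongodbioIT timing above comes from driving
apache_beam.io.mongodbio; a minimal read along those lines, with a
placeholder URI, database and collection rather than the suite's actual
configuration:

    import apache_beam as beam
    from apache_beam.io.mongodbio import ReadFromMongoDB

    with beam.Pipeline() as p:
        count = (p
                 | ReadFromMongoDB(uri='mongodb://localhost:27017',
                                   db='beam_it_db',
                                   coll='documents')
                 | beam.combiners.Count.Globally())
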

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_51-15899943996149264121?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_44_18-11399388240265051890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_54_12-10922324920216554068?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5485.912s

FAILED (SKIP=4, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_58-11183864918334731777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_52_34-18046328409595508471?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_01_50-16527727559553214010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_10_00-7566985689607277647?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_52-16031234573115394984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_55_45-15954970550205968978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_12_55-2218944108233839617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_19_35-16935069256699600529?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_56-499238070050129891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_51_02-7431824811214050232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_57_29-595745473187312763?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_04_30-4662109961745295428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_10_56-4887146852594990683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_52-355589812886404005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_54_21-9860511637746073699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_03_12-13260374634806576702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_10_14-6829397680422172038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_17_12-13367520986085592810?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_52-14996981588936732866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_45_24-12317596702614303819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_53_31-12638826580667996586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_01_11-15612958126973579106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_08_41-2399045597554745463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_16_06-17523326641873751625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_23_31-9192195039818090105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_30_36-11795269624807891993?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_52-10970057049233663158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_45_38-2108613150992212188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_53_41-12967445744276943649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_00_32-12366267845815354682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_07_37-4897560764834673507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_14_50-9490667776463656935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_37_52-18175144157799337667?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_46_45-14116186624490252359?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_14_57_11-1341110550602596719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_04_32-9260491488356165312?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-27_15_11_28-5918906283421296198?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 27s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/m3ig6iheloom6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1091

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1091/display/redirect?page=changes>

Changes:

[amyrvold] [BEAM-8832] Allow GCS staging upload chunk size to be increased >1M when


------------------------------------------
[...truncated 959.33 KB...]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-2476833a-474b-4a99-9ee7-5dab4de6d788
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-d99b7d46-7c44-49da-9c4e-708015b8d84b
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-fea3e052-9078-49e1-bb2b-040bbbf99d9b
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:34081
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 16658 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 28, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 15, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 28, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 15, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, 
max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
[flink-runner-job-invoker] ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_c5629961-889c-45d9-8417-fa6625e22f37","basePath":"/tmp/beam-artifact-staging"}: {}
java.io.FileNotFoundException: /tmp/beam-artifact-staging/job_c5629961-889c-45d9-8417-fa6625e22f37/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
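
The FileNotFoundException above is a cleanup race rather than a job failure:
the job reached DONE, but the staging MANIFEST had already been removed when
the job service tried to delete artifacts again. A defensive pattern is to
tolerate an already-missing staging directory; a minimal Python sketch (the
path is copied from the log, the helper name is hypothetical):

    import errno
    import shutil

    def remove_staging_dir(path):
        # Tolerate a staging directory another cleanup already removed.
        try:
            shutil.rmtree(path)
        except OSError as e:
            if e.errno != errno.ENOENT:  # re-raise anything but "not found"
                raise

    remove_staging_dir(
        '/tmp/beam-artifact-staging/job_c5629961-889c-45d9-8417-fa6625e22f37')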

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
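
The BigQuerySink deprecation above has a direct replacement. A minimal sketch
of the suggested migration (the table spec and schema are illustrative, not
taken from the failing test):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'month': 1, 'tornado_count': 3}])
        # Deprecated: beam.io.Write(beam.io.BigQuerySink(...))
        _ = rows | beam.io.WriteToBigQuery(
            'project:dataset.table',
            schema='month:INTEGER,tornado_count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)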
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
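
The "options is deprecated" warnings refer to reading <pipeline>.options after
construction. The supported pattern keeps hold of the PipelineOptions object
and calls view_as on it directly; a minimal sketch (the temp_location value is
illustrative):

    from apache_beam.options.pipeline_options import GoogleCloudOptions
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])
    # Read settings from the options object, not from <pipeline>.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location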
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
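
MatchAll and ReadMatches, flagged as experimental above, are the fileio
transforms the test composes. A minimal sketch of the same pipeline shape (the
glob and hash choice are illustrative; MatchFiles expands to MatchAll
internally):

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Read each matched file in full and hash its bytes.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (p
             | fileio.MatchFiles('/tmp/data/*.txt')  # expand the glob
             | fileio.ReadMatches()                  # open each match
             | 'Checksums' >> beam.Map(compute_hash))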
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 620, in __ProcessHttpResponse
    return self.__client.DeserializeMessage(response_type, content)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 446, in DeserializeMessage
    message = encoding.JsonToMessage(response_type, data)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 287, in __decode_dictionary
    for item in value]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 294, in __decode_dictionary
    setattr(message, field.name, self.decode_field(field, value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py",> line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py",> line 1293, in __set__
    value = self.validate(value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py",> line 1400, in validate
    return self.__validate(value, self.validate_element)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py",> line 1358, in __validate
    return validate_element(value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
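
Note that the TimedOutException does not come from Beam or from the test body: nose's multiprocess plugin enforces a per-test wall-clock limit (configurable via --process-timeout) by scheduling a SIGALRM and raising from the signal handler, which is why the exception surfaces deep inside apitools' JSON decoding, wherever the interpreter happened to be when the alarm fired. A minimal sketch of that mechanism (illustrative only, not nose's actual implementation):

    import signal

    class TimedOutException(Exception):
        """Raised when a test exceeds its wall-clock budget."""

    def signalhandler(signum, frame):
        # Runs wherever the interpreter happens to be when the alarm fires,
        # so the resulting traceback points at whatever code was executing
        # at timeout, not at the timeout machinery itself.
        raise TimedOutException()

    def run_with_timeout(test, timeout_secs):
        signal.signal(signal.SIGALRM, signalhandler)
        signal.alarm(timeout_secs)  # schedule SIGALRM after timeout_secs
        try:
            test()
        finally:
            signal.alarm(0)  # cancel the alarm if the test finished in time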

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5517.813s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
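
All three failures are shell steps exiting non-zero, so each named task can be rerun in isolation from the source checkout with the flags Gradle suggests above, e.g. "./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --stacktrace --info" (invocation shown for illustration; the flags are the ones from the 'Try' hints).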

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 15s
115 actionable tasks: 90 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/he6grnrjjbgje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1090

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1090/display/redirect?page=changes>

Changes:

[github] Bump python precommit timeout to 3hrs


------------------------------------------
[...truncated 945.05 KB...]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (18b6b3abad9d801fab2e0ad40fd83c95).
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (18b6b3abad9d801fab2e0ad40fd83c95) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (1/2) (attempt #0) to 896df724-47fb-4444-80b4-30f1506833d4 @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 18b6b3abad9d801fab2e0ad40fd83c95.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (1/2).
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (18b6b3abad9d801fab2e0ad40fd83c95) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) [DEPLOYING]
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7).
[DataSink (DiscardingOutput) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 846cc7ab8cbdb5aba090a3fc327e05b7.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (846cc7ab8cbdb5aba090a3fc327e05b7) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-1127193136-2171b418 (5cdb4f483e10104b9212eb2374731c59) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 5cdb4f483e10104b9212eb2374731c59 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-1127193136-2171b418(5cdb4f483e10104b9212eb2374731c59).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8138}, allocationId: 7bc7207d840f470abe0dbe354547e0ea, jobId: 5cdb4f483e10104b9212eb2374731c59).
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection c06f681c418913a3d13e110215ec4b5f.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection 896df724-47fb-4444-80b4-30f1506833d4 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection c06f681c418913a3d13e110215ec4b5f: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 80ed0aa07958409113cc994307d8451c@akka://flink/user/jobmanager_1 for job 5cdb4f483e10104b9212eb2374731c59 from the resource manager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-9ae7d479-a4ac-4f51-8be9-3eedcd1f3434
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-47a1be3f-07fc-458b-843a-b97e46bcd8f9
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-6cbd4344-aeb6-4f66-9ba4-54c05374f356
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:36545
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 22724 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 7, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 17, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 11, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 17, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 7, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 14, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, 
max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
[flink-runner-job-invoker] ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_d8f13ea4-77bd-4a8c-a506-77fa9c207271","basePath":"/tmp/beam-artifact-staging"}: {}
java.io.FileNotFoundException: /tmp/beam-artifact-staging/job_d8f13ea4-77bd-4a8c-a506-77fa9c207271/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
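
The FileNotFoundException above is a cleanup race rather than a job failure (the pipeline still reaches DONE below): by the time InMemoryJobService tears down the staging directory, the MANIFEST is already gone, and removeArtifacts fails trying to reopen it. A tolerant best-effort cleanup, sketched in Python purely for illustration (Beam's actual service is Java, the path and the one-artifact-per-line manifest format here are assumptions of the sketch):

    import errno
    import os

    def remove_artifacts(staging_dir):
        # Best-effort cleanup: a missing MANIFEST just means there is
        # nothing left to delete, so treat it as already done.
        manifest = os.path.join(staging_dir, 'MANIFEST')
        try:
            with open(manifest) as f:
                names = [line.strip() for line in f if line.strip()]
        except (IOError, OSError) as e:
            if e.errno == errno.ENOENT:
                return  # already cleaned up (the case logged above)
            raise
        for name in names:
            try:
                os.remove(os.path.join(staging_dir, name))
            except OSError:
                pass  # individual artifacts may also already be gone
        os.remove(manifest)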
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test suite for <class 'apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests'>
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/suite.py",> line 177, in __call__
    return self.run(*arg, **kw)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 822, in run
    test.config.plugins.addError(test,err)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/manager.py",> line 99, in __call__
    return self.call(*arg, **kw)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/manager.py",> line 167, in simple
    result = meth(*arg, **kw)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/xunit.py",> line 287, in addError
    tb = format_exception(err, self.encoding)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/pyversion.py",> line 214, in format_exception
    ''.join(traceback.format_exception(*exc_info)),
  File "/usr/lib/python2.7/traceback.py", line 141, in format_exception
    list = list + format_tb(tb, limit)
  File "/usr/lib/python2.7/traceback.py", line 76, in format_tb
    return format_list(extract_tb(tb, limit))
  File "/usr/lib/python2.7/traceback.py", line 101, in extract_tb
    line = linecache.getline(filename, lineno, f.f_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/lib/python2.7/linecache.py",> line 14, in getline
    lines = getlines(filename, module_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/lib/python2.7/linecache.py",> line 41, in getlines
    return updatecache(filename, module_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/lib/python2.7/linecache.py",> line 132, in updatecache
    lines = fp.readlines()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: '<nose.plugins.multiprocess.NoSharedFixtureContextSuite context=BigQueryStreamingInsertTransformIntegrationTests>'
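
The TimedOutException comes from nose's multiprocess plugin, not from BigQuery code: when a worker exceeds the per-process timeout, the plugin's signal handler aborts the suite and records it as an error, which is why the traceback runs through signalhandler and the xunit error formatter. When reproducing locally, the limit can be raised with the plugin's standard options, for example (timeout value illustrative):

    nosetests --processes=8 --process-timeout=4500 apache_beam.io.gcp.bigquery_test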

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5536.756s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
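
To follow the three "Run with --stacktrace" hints above, each failing task can be re-run on its own with the extra logging enabled, for example for the third failure (flags are standard Gradle options):

    ./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --stacktrace --info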

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 48s
115 actionable tasks: 89 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/qow5x6vl35lfm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1089

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1089/display/redirect>

Changes:


------------------------------------------
[...truncated 733.72 KB...]
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/11/27 18:15:53 INFO datanode.webhdfs: 192.168.160.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/11/27 18:15:53 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=192.168.160.3:50010 for /kinglear.txt
datanode_1  | 19/11/27 18:15:53 INFO datanode.DataNode: Receiving BP-78798608-192.168.160.2-1574878503515:blk_1073741825_1001 src: /192.168.160.3:40382 dest: /192.168.160.3:50010
datanode_1  | 19/11/27 18:15:53 INFO DataNode.clienttrace: src: /192.168.160.3:40382, dest: /192.168.160.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-328139826_67, offset: 0, srvID: 74704b06-2cb0-4c37-8984-8dd3d83b905b, blockid: BP-78798608-192.168.160.2-1574878503515:blk_1073741825_1001, duration: 16274153
datanode_1  | 19/11/27 18:15:53 INFO datanode.DataNode: PacketResponder: BP-78798608-192.168.160.2-1574878503515:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/27 18:15:53 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/11/27 18:15:53 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/11/27 18:15:53 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-328139826_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | /usr/local/lib/python2.7/site-packages/apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
test_1      |   'You are using Apache Beam with Python 2. '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f3782915550> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f3782915650> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f37829156d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f3782915750> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f37829157d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f37829158d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f3782915950> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f37829159d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f3782915a50> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f3782915bd0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f3782915c50> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f3782915cd0> ====================
test_1      | INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f3782842590> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/11/27 18:15:56 INFO datanode.webhdfs: 192.168.160.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:apache_beam.io.hadoopfilesystem:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/11/27 18:15:59 INFO datanode.webhdfs: 192.168.160.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-f4915536114111eaa04b0242c0a8a004/941336de-2270-4724-821a-1ecaf0cd1c1c.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/11/27 18:15:59 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=192.168.160.3:50010 for /beam-temp-py-wordcount-integration-f4915536114111eaa04b0242c0a8a004/941336de-2270-4724-821a-1ecaf0cd1c1c.py-wordcount-integration
datanode_1  | 19/11/27 18:15:59 INFO datanode.DataNode: Receiving BP-78798608-192.168.160.2-1574878503515:blk_1073741826_1002 src: /192.168.160.3:40408 dest: /192.168.160.3:50010
datanode_1  | 19/11/27 18:15:59 INFO DataNode.clienttrace: src: /192.168.160.3:40408, dest: /192.168.160.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1845469451_69, offset: 0, srvID: 74704b06-2cb0-4c37-8984-8dd3d83b905b, blockid: BP-78798608-192.168.160.2-1574878503515:blk_1073741826_1002, duration: 3981435
datanode_1  | 19/11/27 18:15:59 INFO datanode.DataNode: PacketResponder: BP-78798608-192.168.160.2-1574878503515:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/27 18:15:59 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-f4915536114111eaa04b0242c0a8a004/941336de-2270-4724-821a-1ecaf0cd1c1c.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1845469451_69
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python2-1089_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python2-1089_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1089_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1089_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python2-1089_namenode_1 ... done
Aborting on container exit...

real	1m36.231s
user	0m1.147s
sys	0m0.183s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-1089 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-1089_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1089_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1089_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1089_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1089_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1089_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-1089_test_net

real	0m0.800s
user	0m0.593s
sys	0m0.114s

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
Using default tag: latest
latest: Pulling from library/mongo
Digest: sha256:1a9478d8188d6be31dd2e8de076d402edf20446e54933aad7ff49f5b457d486c
Status: Image is up to date for mongo:latest
3dc052ba6b182d8e3d830035f71729618682fa7227198feeb36971180b4675e7
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.25.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.9.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.11.0)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.8)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.6.6)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.15.1)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.7.0,>=0.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.3)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (4.6.6)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.30.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.18.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.18.0.dev0) (42.0.1)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.18.0.dev0) (2.4.5)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.18.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.23)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2019.9.11)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.8)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.18.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>, outside environment <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Writing 100000 documents to mongodb
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:70: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size)
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fbb44266848> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fbb44266938> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fbb442669b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fbb44266a28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fbb44266aa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fbb44266b90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fbb44266c08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fbb44266c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fbb44266cf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fbb44266e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fbb44266ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fbb44266f50> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fbb44690b10> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_Create documents/Read_3)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_5)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:__main__:Writing 100000 documents to mongodb finished in 51.981 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1574878586
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:85: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fbb44266848> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fbb44266938> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fbb442669b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fbb44266a28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fbb44266aa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fbb44266b90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fbb44266c08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fbb44266c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fbb44266cf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fbb44266e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fbb44266ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fbb44266f50> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fbb44eedf10> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 23.570 seconds
mongoioit27152
mongoioit27152

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 45s
111 actionable tasks: 85 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/2c4vq6ta3gh3k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1088

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1088/display/redirect>

Changes:


------------------------------------------
[...truncated 505.66 KB...]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side1 -> Map 7d00f65cf62f362b0ff71f00837c19ad.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side1 -> Map (1/2) (0c8026a83396003366d0a439c44a6eac) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side1 -> Map (2/2) (7d00f65cf62f362b0ff71f00837c19ad) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (6f2880a900d84fb1af5595e270e23f2a) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (6f2880a900d84fb1af5595e270e23f2a).
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (6f2880a900d84fb1af5595e270e23f2a) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem 6f2880a900d84fb1af5595e270e23f2a.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (6f2880a900d84fb1af5595e270e23f2a) switched from RUNNING to FINISHED.
WARNING:apache_beam.io.filebasedsink:Deleting 2 existing files in target path matching: -*-of-%(num_shards)05d
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (7ac6436a6e7e7ee4ce80334d7c039cc6) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (7ac6436a6e7e7ee4ce80334d7c039cc6).
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (7ac6436a6e7e7ee4ce80334d7c039cc6) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem 7ac6436a6e7e7ee4ce80334d7c039cc6.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (000bdce5c470ba35eb5c0ffe9bac7fb4) switched from RUNNING to FINISHED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (000bdce5c470ba35eb5c0ffe9bac7fb4).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (000bdce5c470ba35eb5c0ffe9bac7fb4) [FINISHED]
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (b29724bbc197d3ef400ece482fde0d9c) switched from RUNNING to FINISHED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (b29724bbc197d3ef400ece482fde0d9c).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (b29724bbc197d3ef400ece482fde0d9c) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 000bdce5c470ba35eb5c0ffe9bac7fb4.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map b29724bbc197d3ef400ece482fde0d9c.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (7ac6436a6e7e7ee4ce80334d7c039cc6) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/FinalizeWrite (1/2) (33f2dee6730a075af29efad6b5e10065) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/FinalizeWrite (1/2) (33f2dee6730a075af29efad6b5e10065).
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/FinalizeWrite (1/2) (33f2dee6730a075af29efad6b5e10065) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite 33f2dee6730a075af29efad6b5e10065.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (000bdce5c470ba35eb5c0ffe9bac7fb4) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (b29724bbc197d3ef400ece482fde0d9c) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (1/2) (33f2dee6730a075af29efad6b5e10065) switched from RUNNING to FINISHED.
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 2 (skipped: 0), batches: 2, num_threads: 2
INFO:apache_beam.io.filebasedsink:Renamed 2 shards in 0.12 seconds.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Closing environment urn: "beam:env:external:v1"
payload: "\n\021\022\017localhost:34821"

INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (2d6df793c4aee08e4dbcc7ae1fb4f730) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/FinalizeWrite (2/2) (2d6df793c4aee08e4dbcc7ae1fb4f730).
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/FinalizeWrite (2/2) (2d6df793c4aee08e4dbcc7ae1fb4f730) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite 2d6df793c4aee08e4dbcc7ae1fb4f730.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (2d6df793c4aee08e4dbcc7ae1fb4f730) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-jenkins-1127120523-2c0ba705 (19009150a211989c62a926999131577b) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint coordinator for job 19009150a211989c62a926999131577b.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - Shutting down
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 19009150a211989c62a926999131577b reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-jenkins-1127120523-2c0ba705(19009150a211989c62a926999131577b).
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 7444da6ccbadc76560fb5d065d33259f.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection 256fc9da-83ef-4f23-a216-823d7ce6ab8b because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 7444da6ccbadc76560fb5d065d33259f: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager bb63fd4467b61f42cb2b7ebf6e8e4ffa@akka://flink/user/jobmanager_1 for job 19009150a211989c62a926999131577b from the resource manager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-1e77f9ab-c5c3-4951-8bba-e3f049fb9eeb
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-23e4c944-74a6-48b1-9117-6dded97a974c
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-5edb2324-4f0f-43ac-9a9f-29292eefa5db
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-12] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:36353
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 40200 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=words}: 96, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 27, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 96, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 1, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 
0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_lengths}: 298, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 2, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0)Distributions(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=653, count=32, min=18, max=27}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=831, count=35, min=20, max=29}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=640, count=33, min=17, max=26}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=106, count=2, min=53, max=53}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=14, count=1, min=14, max=14}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=685, count=1, min=685, max=685}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=656, count=31, min=19, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=81, count=1, min=81, max=81}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=276, count=2, min=138, max=138}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=278, count=2, min=139, max=139}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=13, count=1, min=13, max=13}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=276, count=2, min=138, max=138}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=271, count=1, min=271, max=271}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=749, count=20, min=14, max=84}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=608, count=37, min=14, max=23}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=770, count=42, min=16, max=25}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=685, count=1, min=685, max=685}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 
6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: DistributionResult{sum=278, count=2, min=139, max=139}))
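
One reading note on the dump above: each key combines a stage name, the environment urn and a metric urn, with the PTRANSFORM or PCOLLECTION label in braces, and a DistributionResult with count=0 prints its min and max as the int64 extremes, i.e. an empty distribution, as for ref_PCollection_PCollection_29. A tiny check of those sentinel values (illustrative):

    INT64_MAX = 2**63 - 1
    INT64_MIN = -2**63

    print(INT64_MAX)   # 9223372036854775807  -> "min" of an empty distribution
    print(INT64_MIN)   # -9223372036854775808 -> "max" of an empty distribution
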
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempIgAHe1/artifacts4rWztU/job_4f3937e9-9e9c-475e-951a-f794a634b5f5/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempIgAHe1/artifacts4rWztU/job_4f3937e9-9e9c-475e-951a-f794a634b5f5/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1127120523-2c0ba705_ecbe0f47-cd8b-4bf0-9c58-40f0711c4583
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1127120523-2c0ba705_ecbe0f47-cd8b-4bf0-9c58-40f0711c4583
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
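
The "3" above is consistent with the user metrics in the accumulator dump: words = 96 and word_lengths = 298, and under Python 2 (the SDK version this suite exercises) dividing the two counters truncates. A one-line check, assuming the example divides those counters directly:

    words, total_len = 96, 298     # counter values from the metric dump above
    print(total_len / words)       # 3 under Python 2; 3.104... under Python 3
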

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/11/27 12:06:11 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37469
19/11/27 12:06:11 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34495
19/11/27 12:06:11 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:56291
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
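
That warning is expected here: --parallelism and --shutdown_sources_on_final_watermark are options for the portable Flink/Spark job server on the Java side, so the Python PipelineOptions parser does not recognize them and drops them with a warning rather than failing. A small sketch of how it surfaces (constructing options from the same flags; illustrative only):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Unknown flags are kept as "unparseable" and reported when the
    # options are materialized.
    opts = PipelineOptions(['--parallelism=2',
                            '--shutdown_sources_on_final_watermark'])
    print(opts.get_all_options(drop_default=True))
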
19/11/27 12:06:13 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7
19/11/27 12:06:13 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/27 12:06:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/27 12:06:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/27 12:06:13 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/11/27 12:06:14 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/11/27 12:06:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7 on Spark master local[4]
19/11/27 12:06:14 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/11/27 12:06:14 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:39161.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/11/27 12:06:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:41351.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:39427
19/11/27 12:06:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
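
The lines above trace the standard SDK-harness bring-up in LOOPBACK mode: the worker first opens the control channel, then the state and data channels, each a plain insecure gRPC channel to a port the runner picked. Sketched with the ports from this run (illustrative; the real harness wraps these channels in the Fn API stubs):

    import grpc

    control_channel = grpc.insecure_channel('localhost:39161')
    state_channel = grpc.insecure_channel('localhost:41351')
    data_channel = grpc.insecure_channel('localhost:39427')
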
19/11/27 12:06:17 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
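
The target-path pattern in that warning is the default shard-name template: five-digit shard index and shard count, so a re-run first deletes any name-SSSSS-of-NNNNN files left by a previous run. A minimal sketch of the pattern (prefix and values hypothetical; the 05d width matches %(num_shards)05d above):

    def shard_name(prefix, shard, num_shards):
        return '%s-%05d-of-%05d' % (prefix, shard, num_shards)

    print(shard_name('counts', 0, 4))   # counts-00000-of-00004
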
19/11/27 12:06:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/11/27 12:06:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7 finished.
19/11/27 12:06:57 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 12:06:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempX4726s/artifactsqU4xqk/job_3d57d784-1ee8-4aa7-b6e8-63c98f11e104/MANIFEST has 1 artifact locations
19/11/27 12:06:57 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempX4726s/artifactsqU4xqk/job_3d57d784-1ee8-4aa7-b6e8-63c98f11e104/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/11/27 12:06:58 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7
19/11/27 12:06:58 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1127120613-b718ce07_69fb06bb-d348-453c-b273-254a2b6784b7
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1574856418.400824584","description":"Error received from peer ipv4:127.0.0.1:41351","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1574856418.400790983","description":"Error received from peer ipv4:127.0.0.1:39427","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1574856418.400790983","description":"Error received from peer ipv4:127.0.0.1:39427","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574856418.400831157","description":"Error received from peer ipv4:127.0.0.1:39161","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
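
All three tracebacks above (read_state, read_grpc_client_inputs, run_worker_1-1, plus the data_plane ERROR) are the same teardown race: the job already reached DONE, the runner shut down its control, state and data gRPC servers, and the SDK harness's background reader threads were still blocked on server-streaming RPCs, so each stream ended with StatusCode.UNAVAILABLE ("Connection reset by peer" / "Socket closed"). In this log they are noise after a successful Spark run, not the cause of the build failure. A minimal sketch of tolerating that case in a reader thread (handle and shutting_down are hypothetical helpers):

    import logging
    import grpc

    def drain(response_iterator, shutting_down, handle):
        # Drain a server-streaming RPC; treat UNAVAILABLE during shutdown
        # as expected rather than as an error.
        try:
            for response in response_iterator:
                handle(response)
        except grpc.RpcError as e:
            if shutting_down() and e.code() == grpc.StatusCode.UNAVAILABLE:
                logging.info('Stream closed during shutdown: %s', e.details())
            else:
                raise
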


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle'> line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 32s
112 actionable tasks: 86 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/qiym6otgukpyu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1087

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1087/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-8470] Exclude failed ValidatesRunner tests


------------------------------------------
[...truncated 507.26 KB...]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (88a4c08f425d459a6ed50fe606c05338) switched from RUNNING to FINISHED.
WARNING:apache_beam.io.filebasedsink:Deleting 2 existing files in target path matching: -*-of-%(num_shards)05d
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (d641b501dde9215b1965adce486ca288) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (d641b501dde9215b1965adce486ca288).
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (d641b501dde9215b1965adce486ca288) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem d641b501dde9215b1965adce486ca288.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (0f77fc481cc40577708ee9c6e4c4d52e) switched from RUNNING to FINISHED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (0f77fc481cc40577708ee9c6e4c4d52e).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (62642dc01b9e75c36dc086abc1314137) switched from RUNNING to FINISHED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (62642dc01b9e75c36dc086abc1314137).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (0f77fc481cc40577708ee9c6e4c4d52e) [FINISHED]
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (62642dc01b9e75c36dc086abc1314137) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 0f77fc481cc40577708ee9c6e4c4d52e.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 62642dc01b9e75c36dc086abc1314137.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (d641b501dde9215b1965adce486ca288) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (0f77fc481cc40577708ee9c6e4c4d52e) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (62642dc01b9e75c36dc086abc1314137) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/FinalizeWrite (1/2) (e4d9e5544259c906a346765774167c2f) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/FinalizeWrite (1/2) (e4d9e5544259c906a346765774167c2f).
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/FinalizeWrite (1/2) (e4d9e5544259c906a346765774167c2f) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite e4d9e5544259c906a346765774167c2f.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (1/2) (e4d9e5544259c906a346765774167c2f) switched from RUNNING to FINISHED.
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 2 (skipped: 0), batches: 2, num_threads: 2
INFO:apache_beam.io.filebasedsink:Renamed 2 shards in 0.12 seconds.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Closing environment urn: "beam:env:external:v1"
payload: "\n\021\022\017localhost:41943"

INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (b781349bf47e4f73322d1a6dc4a9df69) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/FinalizeWrite (2/2) (b781349bf47e4f73322d1a6dc4a9df69).
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/FinalizeWrite (2/2) (b781349bf47e4f73322d1a6dc4a9df69) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite b781349bf47e4f73322d1a6dc4a9df69.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (b781349bf47e4f73322d1a6dc4a9df69) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-jenkins-1127090753-f71dfc8d (96ac8178d340da4db39858148a36b483) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint coordinator for job 96ac8178d340da4db39858148a36b483.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - Shutting down
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 96ac8178d340da4db39858148a36b483 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-jenkins-1127090753-f71dfc8d(96ac8178d340da4db39858148a36b483).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8137}, allocationId: 3a6cf8457aed7b75a6bceb7c903948bd, jobId: 96ac8178d340da4db39858148a36b483).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection b578e88926374b41de22a1a048ace09e: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager a30ccc33629c1e3ab47562d6039d4ff3@akka://flink/user/jobmanager_1 for job 96ac8178d340da4db39858148a36b483 from the resource manager.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 96ac8178d340da4db39858148a36b483 with leader id a30ccc33629c1e3ab47562d6039d4ff3 lost leadership.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8137}, allocationId: af98d06735dbec2ae477133421438dcb, jobId: 96ac8178d340da4db39858148a36b483).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 96ac8178d340da4db39858148a36b483 from job leader monitoring.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 96ac8178d340da4db39858148a36b483.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 96ac8178d340da4db39858148a36b483.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 96ac8178d340da4db39858148a36b483 because it is not registered.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection b578e88926374b41de22a1a048ace09e.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection d4f530e0-d580-491f-b18f-428ff2a04076 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-534c00f5-3574-4007-aebc-d22a404f4be4
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-13542732-0117-4f05-aa0d-ddce536fd46f
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-08a27d5b-5040-4fa2-844a-806ee3e84226
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44619
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 3755 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=words}: 96, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 27, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 96, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 2, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: 2, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 2, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_lengths}: 298, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0)Distributions(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=712, count=35, min=18, max=27}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=806, count=34, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=658, count=34, min=17, max=26}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=14, count=1, min=14, max=14}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 
DistributionResult{sum=685, count=1, min=685, max=685}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=106, count=2, min=53, max=53}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=727, count=34, min=19, max=28}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=276, count=2, min=138, max=138}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=278, count=2, min=139, max=139}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=13, count=1, min=13, max=13}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=81, count=1, min=81, max=81}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=276, count=2, min=138, max=138}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=271, count=1, min=271, max=271}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=754, count=20, min=14, max=84}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=605, count=37, min=14, max=20}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=561, count=30, min=16, max=25}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=685, count=1, min=685, max=685}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: DistributionResult{sum=278, count=2, min=139, max=139}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-temp5Ad7fM/artifactsZ8gKU4/job_fe146d6f-fcf4-4507-a94e-5ecb4c0cf82c/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-temp5Ad7fM/artifactsZ8gKU4/job_fe146d6f-fcf4-4507-a94e-5ecb4c0cf82c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1127090753-f71dfc8d_dcb552a4-1712-47a3-a00f-68b169fa625c
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1127090753-f71dfc8d_dcb552a4-1712-47a3-a00f-68b169fa625c
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
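
The user metrics in the accumulator dump above, reported under the __main__.WordExtractingDoFn namespace (words: 96, empty_lines: 2, word_lengths: 298, and the word_len_dist distribution whose sum=298 over count=96 yields the "average word length: 3" printed here), are declared inside the example pipeline's DoFn. Below is a minimal sketch of how such counters and distributions are created with the Beam Python metrics API; the class and metric names mirror the log, but the body is an illustrative reconstruction, not the exact example source:

    import re

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class WordExtractingDoFn(beam.DoFn):
      """Splits lines into words while recording user metrics."""

      def __init__(self):
        # The namespace/name pairs below match the NAMESPACE/NAME entries
        # in the metrics dump above.
        self.words_counter = Metrics.counter(self.__class__, 'words')
        self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
        self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')
        self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')

      def process(self, element):
        if not element.strip():
          self.empty_line_counter.inc(1)
        words = re.findall(r"[\w']+", element)
        for w in words:
          self.words_counter.inc()
          self.word_lengths_counter.inc(len(w))
          self.word_lengths_dist.update(len(w))
        return words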

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/11/27 09:08:05 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:42387
19/11/27 09:08:05 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:39271
19/11/27 09:08:06 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:42989
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/11/27 09:08:07 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499
19/11/27 09:08:07 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
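
The two driver-side lines above show the pipeline being submitted to the portable JobService (started on localhost:42989 a few lines earlier) with a LOOPBACK environment, meaning the SDK harness runs inside the submitting Python process rather than in a container. A rough sketch of such a submission, assuming the endpoint from the log; this is a reconstruction, not the test suite's actual invocation:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Endpoint taken from the "JobService started on localhost:42989" line above.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:42989',
        '--environment_type=LOOPBACK',
    ])
    with beam.Pipeline(options=options) as p:
      _ = p | beam.Create(['a line of text']) | beam.Map(len)
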
19/11/27 09:08:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/27 09:08:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/27 09:08:08 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/11/27 09:08:08 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/11/27 09:08:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499 on Spark master local[4]
19/11/27 09:08:09 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/11/27 09:08:09 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42507.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/11/27 09:08:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37029.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:44339
19/11/27 09:08:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:08:12 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 09:08:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/11/27 09:08:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499 finished.
19/11/27 09:08:16 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 09:08:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp5FX3z4/artifactsgKUL4L/job_8ee841d4-58ea-4ce9-a58a-096582af22bf/MANIFEST has 1 artifact locations
19/11/27 09:08:16 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp5FX3z4/artifactsgKUL4L/job_8ee841d4-58ea-4ce9-a58a-096582af22bf/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/11/27 09:08:16 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499
19/11/27 09:08:16 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1127090807-7644ce6b_9e7e21e1-2736-4ff4-bb36-2306d6439499
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574845696.502085384","description":"Error received from peer ipv4:127.0.0.1:37029","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574845696.502148221","description":"Error received from peer ipv4:127.0.0.1:42507","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574845696.502062195","description":"Error received from peer ipv4:127.0.0.1:44339","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574845696.502062195","description":"Error received from peer ipv4:127.0.0.1:44339","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
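
The UNAVAILABLE/"Socket closed" failures above all come from the SDK harness's background reader threads: the job already reached DONE, so the job server tears down the control, state, and data channels (ports 42507, 37029 and 44339 above) while those threads are still blocked iterating streaming RPCs, and the iteration then raises grpc.RpcError. A minimal sketch of tolerating such shutdown-time errors, assuming a streaming-response iterator `responses` and a hypothetical `handle` callback:

    import grpc

    def drain(responses, handle):
      # 'responses' is a gRPC streaming-response iterator; 'handle' is a
      # hypothetical per-message callback.
      try:
        for response in responses:
          handle(response)
      except grpc.RpcError as err:
        # At teardown the peer closes the socket, which surfaces as
        # StatusCode.UNAVAILABLE ("Socket closed"); treat that as end-of-stream.
        if err.code() != grpc.StatusCode.UNAVAILABLE:
          raise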


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
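
For local triage, either failing task can usually be rerun from a Beam checkout with the extra diagnostics Gradle suggests, e.g. ./gradlew :sdks:python:sdist --stacktrace --info (the wrapper invocation and checkout layout are assumptions here; the task paths are taken from the failure summary above).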

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
112 actionable tasks: 86 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/5wblyta5pyacm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1086

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1086/display/redirect>

Changes:


------------------------------------------
[...truncated 514.67 KB...]
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite 7812b4442529a1a553194286984012ff.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (7812b4442529a1a553194286984012ff) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-jenkins-1127060945-c88e9bff (ea53cefa289058ae89df6499cc4fe2e7) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint coordinator for job ea53cefa289058ae89df6499cc4fe2e7.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - Shutting down
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1574835038.851786133","description":"Error received from peer ipv4:127.0.0.1:45495","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1574835038.851786133","description":"Error received from peer ipv4:127.0.0.1:45495","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job ea53cefa289058ae89df6499cc4fe2e7 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-jenkins-1127060945-c88e9bff(ea53cefa289058ae89df6499cc4fe2e7).
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 4656f4e9da579b69848137dc2e6a5850: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8136}, allocationId: aebaf65927dc4adf8e1994def77b3ec0, jobId: ea53cefa289058ae89df6499cc4fe2e7).
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager aed00098ea217f7f9d1919619c734c26@akka://flink/user/jobmanager_1 for job ea53cefa289058ae89df6499cc4fe2e7 from the resource manager.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-14] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job ea53cefa289058ae89df6499cc4fe2e7 with leader id aed00098ea217f7f9d1919619c734c26 lost leadership.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8136}, allocationId: af5e9af72f29834fbb7bd754d2d096b0, jobId: ea53cefa289058ae89df6499cc4fe2e7).
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job ea53cefa289058ae89df6499cc4fe2e7 from job leader monitoring.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job ea53cefa289058ae89df6499cc4fe2e7.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job ea53cefa289058ae89df6499cc4fe2e7.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job ea53cefa289058ae89df6499cc4fe2e7 because it is not registered.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 4656f4e9da579b69848137dc2e6a5850.
[flink-akka.actor.default-dispatcher-13] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection 30c2d6d6-59bc-4399-8c17-46f991feaa71 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-10c5aba3-187b-4570-bcec-47b0075e84a4
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-b7d5bb62-8aad-4a23-beaf-87bb0f61f1f7
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-a735f6b1-d509-478e-9d99-ede25a362a08
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-14] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37047
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 50323 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user 
{NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=words}: 96, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 27, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 96, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 96, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 1, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 2, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_lengths}: 298, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0)Distributions(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=710, count=35, min=18, max=24}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=775, count=33, min=20, max=29}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=624, count=32, min=17, max=26}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=14, count=1, min=14, max=14}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=685, count=1, min=685, max=685}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=106, count=2, min=53, max=53}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=764, count=36, min=19, max=25}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=81, count=1, min=81, max=81}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=276, count=2, min=138, max=138}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=278, count=2, min=139, max=139}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=13, count=1, min=13, max=13}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=276, count=2, min=138, max=138}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=271, count=1, min=271, max=271}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=859, count=24, min=14, max=84}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=578, count=35, min=14, max=20}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=732, count=40, min=16, max=25}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=685, count=1, min=685, max=685}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: DistributionResult{sum=278, count=2, min=139, max=139}))
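
The entries namespaced __main__.WordExtractingDoFn in the accumulator
dump above (words, empty_lines, word_lengths, word_len_dist) are user
metrics declared inside the wordcount example's DoFn. A condensed
sketch of how such counters and distributions are typically wired up
(abbreviated, not the example's full code):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class WordExtractingDoFn(beam.DoFn):
      def __init__(self):
        self.words_counter = Metrics.counter(self.__class__, 'words')
        self.empty_line_counter = Metrics.counter(
            self.__class__, 'empty_lines')
        self.word_len_dist = Metrics.distribution(
            self.__class__, 'word_len_dist')

      def process(self, element):
        if not element.strip():
          self.empty_line_counter.inc()
        for word in element.split():
          self.words_counter.inc()
          self.word_len_dist.update(len(word))
          yield word
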
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempQg9xEz/artifactsgIsRpF/job_db070576-d433-450e-b2b8-d7b400ab8270/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempQg9xEz/artifactsgIsRpF/job_db070576-d433-450e-b2b8-d7b400ab8270/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1127060945-c88e9bff_1ab08535-6f29-4cec-b106-b35e326b8228
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1127060945-c88e9bff_1ab08535-6f29-4cec-b106-b35e326b8228
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
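
Those two INFO:root lines are the example querying its own user metrics
from the finished PipelineResult; the values match the counters in the
dump above. A sketch along the lines of Beam's wordcount example,
assuming a completed PipelineResult named result:

    from apache_beam.metrics.metric import MetricsFilter

    filters = MetricsFilter().with_name('empty_lines')
    query_result = result.metrics().query(filters)
    if query_result['counters']:
      empty_lines = query_result['counters'][0]
      print('number of empty lines: %d' % empty_lines.result)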

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/11/27 06:10:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:34517
19/11/27 06:10:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:39609
19/11/27 06:10:44 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:60805
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/11/27 06:10:45 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6
19/11/27 06:10:46 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
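
The LOOPBACK environment noted above means the Python process that
submitted the job also hosts the SDK harness, so the runner dials back
into it instead of launching a Docker container. A sketch of pointing a
pipeline at the job service this task starts (the port is taken from
the JobService log line above and changes run to run):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:60805',  # JobService port from the log
        '--environment_type=LOOPBACK',     # SDK harness runs in-process
    ])
    with beam.Pipeline(options=options) as p:
      _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)
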
19/11/27 06:10:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/27 06:10:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/27 06:10:46 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/11/27 06:10:46 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/11/27 06:10:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6 on Spark master local[4]
19/11/27 06:10:47 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/11/27 06:10:47 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46451.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/11/27 06:10:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33293.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42461
19/11/27 06:10:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:10:50 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/11/27 06:11:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.17 seconds.
19/11/27 06:11:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6 finished.
19/11/27 06:11:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 06:11:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp4Wr_oK/artifacts6VCVlD/job_0024fc24-095b-42c9-b97f-f548fb5f739e/MANIFEST has 1 artifact locations
19/11/27 06:11:43 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp4Wr_oK/artifacts6VCVlD/job_0024fc24-095b-42c9-b97f-f548fb5f739e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/11/27 06:11:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6
19/11/27 06:11:43 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1127061045-283d787_84b8677e-9ca0-41dd-89e7-0a6101b647e6
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574835103.880300133","description":"Error received from peer ipv4:127.0.0.1:42461","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574835103.880284185","description":"Error received from peer ipv4:127.0.0.1:46451","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574835103.880300133","description":"Error received from peer ipv4:127.0.0.1:42461","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574835103.880277359","description":"Error received from peer ipv4:127.0.0.1:33293","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>



> Task :sdks:python:test-suites:portable:py2:postCommitPy2

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 40s
112 actionable tasks: 86 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/th64sgqf3qcys

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1085

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1085/display/redirect?page=changes>

Changes:

[aaltay] [BEAM-7390] Add code snippets for Count (#9923)

[aaltay] [BEAM-7390] Add code snippets for CombineGlobally (#9920)

[aaltay] [BEAM-7390] Add code snippets for CombineValues (#9922)

[aaltay] Fix sorting order bug. (#9883)


------------------------------------------
[...truncated 957.88 KB...]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-bdbd9714-4c26-4bcc-b940-4320037eb777
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:32897
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 23570 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 4, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 4, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, 
max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
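
The block above is one MetricQueryResults log line from the Flink runner, listing every counter and distribution the job produced. On the Python side the same results are reachable through the PipelineResult; a minimal sketch against the public metrics API, where the name filter is illustrative and system metrics such as element_count may not be exposed by every runner:

    import apache_beam as beam
    from apache_beam.metrics.metric import MetricsFilter

    p = beam.Pipeline()
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x)
    result = p.run()
    result.wait_until_finish()
    # Query committed counter values by metric name from the finished run.
    counters = result.metrics().query(
        MetricsFilter().with_name('element_count'))['counters']
    for counter in counters:
        print('%s: %s' % (counter.key, counter.committed))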
[flink-runner-job-invoker] ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_49d551f5-256c-49b1-b22f-237b25301fd7","basePath":"/tmp/beam-artifact-staging"}: {}
java.io.FileNotFoundException: /tmp/beam-artifact-staging/job_49d551f5-256c-49b1-b22f-237b25301fd7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
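
The FileNotFoundException above appears to be cleanup noise rather than the failure itself: the job still reaches DONE, but the staging directory's MANIFEST is already gone when removeArtifacts runs. An illustrative Python sketch of an idempotent cleanup that tolerates the missing manifest; the path is taken from the log, and this is not the Beam implementation (which is the Java code in the stack trace):

    import errno
    import os
    import shutil

    def remove_staging_dir(base_path):
        manifest = os.path.join(base_path, 'MANIFEST')
        try:
            open(manifest).close()  # real code would read artifact names here
        except IOError as err:
            if err.errno == errno.ENOENT:
                return  # already cleaned up: the condition logged above
            raise
        shutil.rmtree(base_path, ignore_errors=True)

    remove_staging_dir(
        '/tmp/beam-artifact-staging/job_49d551f5-256c-49b1-b22f-237b25301fd7')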

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
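
The BeamDeprecationWarning above (repeated throughout this suite) names its replacement directly: use WriteToBigQuery instead of BigQuerySink. A minimal sketch of the suggested migration; the project, dataset, table, and schema are placeholders, and running it requires GCP credentials:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'word': 'beam', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',  # hypothetical table spec
                 schema='word:STRING,count:INTEGER',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))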
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
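
The recurring "options is deprecated" warnings come from reading options back through <pipeline>.options, as in the line above. A minimal sketch of the pattern the warning asks for: keep the PipelineOptions object you constructed and query it directly (the --streaming flag here is only illustrative):

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import (PipelineOptions,
                                                      StandardOptions)

    options = PipelineOptions(['--streaming'])
    p = Pipeline(options=options)
    # Query the options you built instead of p.options.view_as(...).
    streaming = options.view_as(StandardOptions).streaming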
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
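
The error is not an assertion failure: nose's multiprocess plugin timed the test out while it was blocked on an SSL read of the Dataflow job status, as the last frames show. A minimal sketch of the mechanism visible in that final signalhandler frame; the 2-second budget is illustrative, the real one comes from nose's --process-timeout:

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # Mirrors nose/plugins/multiprocess.py raising inside the test.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(2)  # hypothetical per-test budget
    try:
        time.sleep(10)  # stands in for the blocked self._sslobj.read(len)
    except TimedOutException:
        print('test timed out waiting on the Dataflow API')
    finally:
        signal.alarm(0)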

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5497.882s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 54s
115 actionable tasks: 89 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/l6zmhoetpry3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1084

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1084/display/redirect>

Changes:


------------------------------------------
[...truncated 1.41 MB...]
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574813852.059595703","description":"Error received from peer ipv4:127.0.0.1:46557","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
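
Editor's note: the _Rendezvous error above is grpc's StatusCode.UNAVAILABLE, surfaced when the SDK worker's control channel to the runner is closed during shutdown. A hedged sketch of checking for that condition from the client side; the address comes from the log line, everything else is illustrative:

    import grpc

    channel = grpc.insecure_channel('127.0.0.1:46557')  # peer from the log
    try:
        # raises grpc.FutureTimeoutError if the socket never becomes ready,
        # the client-side analogue of the UNAVAILABLE/"Socket closed" status
        grpc.channel_ready_future(channel).result(timeout=5)
    except grpc.FutureTimeoutError:
        print('control channel unavailable; worker exits or reconnects')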


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_26-3574233828929218433?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_15_22-10864914395343539218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_24_14-17941295763531503748?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
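
Editor's note: this is the same hang as in #1071 above: wait_until_in_state polls the Dataflow job state until the nose process timeout fires. A sketch of one way to bound such a poll with an explicit deadline instead; the function and parameter names here are illustrative, not the test runner's actual API:

    import time

    def wait_until_in_state(result, target_state, timeout_secs=600):
        """Poll result.state until it matches, or give up after a deadline."""
        deadline = time.time() + timeout_secs
        while time.time() < deadline:
            if result.state == target_state:
                return result.state
            time.sleep(5)  # each .state read is a remote Dataflow API call
        raise RuntimeError('gave up waiting for state %s' % target_state)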

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5457.715s

FAILED (SKIP=4, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_31-10538048918345982116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_23_08-18279494439823184592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_32_00-10095291705772363797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_39_07-17772402769129081531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_47_16-8727928572976731026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_26-10782631789555198357?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_27_52-15371246849196437759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_35_18-10183519755837589871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_42_51-7572503834744441173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_29-6409675536318141171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_21_08-11161903052217858677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_27_59-18277090722972824160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_34_55-3227680164800428613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_41_56-14591546552914059238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_26-2527402228156806993?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_26_19-1242774204570543362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_37_21-2500650630422322348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_44_42-17137042038441455457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_27-7994979414314768428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_15_24-8071347589328267747?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_23_20-1189328669042848016?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_31_10-13933813823580424712?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_37_43-18183970155529621147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_44_36-14652658940833495218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_51_55-4331248548378334299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_58_53-17959185471643157589?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_27-1235395565956160593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_16_22-13281593651761028124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_23_28-16587583963314374055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_30_48-14698129767381285165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_37_43-13233165311735923557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_45_08-8881022462283193611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_52_07-3659339658943925432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_08_27-13086458619829032261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_17_26-1831216324167838634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_27_55-15418030159948988414?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-26_16_45_11-9455348395258459936?project=apache-beam-testing
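
Editor's note: several tests in the run above emit BeamDeprecationWarning pointing from BigQuerySink to WriteToBigQuery. A minimal sketch of the suggested replacement; the project, dataset, table, and schema are placeholders, not values from this build:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # placeholder table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))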

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 40s
118 actionable tasks: 92 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/4ynf2x46qkaqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1083

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1083/display/redirect>

Changes:


------------------------------------------
[...truncated 955.42 KB...]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-30d3c357-5ed8-4f6e-a729-d90b8d6d6ece
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:38927
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 22321 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 4, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 9, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 4, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 9, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, 
max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
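
Each entry in the metrics dump above is a metric key: an operator name (the leading digits appear to be length prefixes on the names), a metric URN such as beam:metric:element_count:v1, and a PTRANSFORM or PCOLLECTION label; each DistributionResult carries sum/count/min/max. The same numbers can be pulled from a PipelineResult programmatically; a minimal sketch for a user counter (the name 'my_counter' is illustrative, and system metrics such as element_count are not queryable on every runner):

    from apache_beam.metrics.metric import MetricsFilter

    # result is the PipelineResult returned by pipeline.run() (assumed in scope).
    result.wait_until_finish()
    metrics = result.metrics().query(MetricsFilter().with_name('my_counter'))
    for counter in metrics['counters']:
        print('%s: %s' % (counter.key, counter.committed))
    for dist in metrics['distributions']:
        # committed is a DistributionResult with sum, count, min and max.
        print('%s: %s' % (dist.key, dist.committed))
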
[flink-runner-job-invoker] ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_904f9783-5c6a-4636-9d20-69fb664dc720","basePath":"/tmp/beam-artifact-staging"}: {}
java.io.FileNotFoundException: /tmp/beam-artifact-staging/job_904f9783-5c6a-4636-9d20-69fb664dc720/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
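
The ERROR and stack trace above are cleanup noise rather than the build failure itself: once the job reaches a terminal state, the job service tries to delete the staged artifacts, which requires first reading the MANIFEST under the staging token's basePath; since that file is already gone, loadManifest throws and the removal is skipped. The real implementation is the Java code in the trace; below is only a rough Python sketch of the same best-effort pattern, with an assumed JSON manifest layout:

    import errno
    import json
    import os
    import shutil

    def remove_staged_artifacts(staging_dir):
        # Best-effort cleanup: a missing MANIFEST means there is nothing to remove.
        manifest_path = os.path.join(staging_dir, 'MANIFEST')
        try:
            with open(manifest_path) as f:
                artifact_names = json.load(f)  # assumed: a JSON list of file names
        except IOError as e:
            if e.errno == errno.ENOENT:
                return  # already cleaned up -- arguably not worth an ERROR
            raise
        for name in artifact_names:
            os.remove(os.path.join(staging_dir, name))
        shutil.rmtree(staging_dir, ignore_errors=True)
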
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
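
The BigQuerySink deprecation above is actionable: since 2.11.0 the recommended transform is WriteToBigQuery rather than beam.io.Write(beam.io.BigQuerySink(...)). A minimal sketch of the replacement, where pcoll is an existing PCollection and the table/schema values are placeholders, not taken from this job:

    import apache_beam as beam

    # Before (deprecated):
    #   pcoll | beam.io.Write(beam.io.BigQuerySink(table, schema=schema))
    # After:
    _ = (pcoll | beam.io.WriteToBigQuery(
        table='project:dataset.table',  # placeholder
        schema='word:STRING, count:INTEGER',  # placeholder
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
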
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
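
The "options is deprecated" warnings here and below come from reading options back off an already-constructed pipeline (p.options or pcoll.pipeline.options). The supported pattern is to keep a reference to the PipelineOptions used to build the pipeline and query that instead; a sketch, assuming the same GoogleCloudOptions view as in the warning:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions()  # parse argv here in a real pipeline
    # Query the options object directly instead of p.options.view_as(...):
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        pass  # construct the pipeline as usual
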
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
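
The TimedOutException at the bottom of this trace is raised by nose's multiprocess plugin, not by the pipeline: the plugin's per-test timeout fired while test_dataflow_runner was still cancelling the Dataflow job, which is why the cancel path shows in the traceback. When reproducing locally, the limit can be raised with the plugin's flag (values and test target below are illustrative):

    nosetests --processes=8 --process-timeout=4500 \
        apache_beam.io.gcp.bigquery_test:BigQueryStreamingInsertTransformIntegrationTests
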

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5524.061s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 21s
115 actionable tasks: 89 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/odrjgqtc6jnhi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1082

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1082/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] move enableSparkMetricSinks option to common spark pipeline


------------------------------------------
[...truncated 737.38 KB...]
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/11/26 13:08:17 INFO datanode.webhdfs: 192.168.224.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/11/26 13:08:17 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=192.168.224.3:50010 for /kinglear.txt
datanode_1  | 19/11/26 13:08:17 INFO datanode.DataNode: Receiving BP-1254819375-192.168.224.2-1574773647488:blk_1073741825_1001 src: /192.168.224.3:50988 dest: /192.168.224.3:50010
datanode_1  | 19/11/26 13:08:17 INFO DataNode.clienttrace: src: /192.168.224.3:50988, dest: /192.168.224.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-174512476_67, offset: 0, srvID: fe78e43c-73b8-4c41-aa14-fa8b56e93dbe, blockid: BP-1254819375-192.168.224.2-1574773647488:blk_1073741825_1001, duration: 18428685
datanode_1  | 19/11/26 13:08:17 INFO datanode.DataNode: PacketResponder: BP-1254819375-192.168.224.2-1574773647488:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/26 13:08:17 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/11/26 13:08:17 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/11/26 13:08:17 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-174512476_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | /usr/local/lib/python2.7/site-packages/apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
test_1      |   'You are using Apache Beam with Python 2. '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fdf7a7265d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fdf7a7266d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fdf7a726750> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fdf7a7267d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fdf7a726850> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fdf7a726950> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fdf7a7269d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fdf7a726a50> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fdf7a726ad0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fdf7a726c50> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fdf7a726cd0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fdf7a726d50> ====================
test_1      | INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fdf7a6533d0> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/11/26 13:08:20 INFO datanode.webhdfs: 192.168.224.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:apache_beam.io.hadoopfilesystem:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/11/26 13:08:23 INFO datanode.webhdfs: 192.168.224.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-d16e6958104d11ea9b6e0242c0a8e004/a1a0464d-1401-4182-b951-65d6e242316f.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/11/26 13:08:23 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=192.168.224.3:50010 for /beam-temp-py-wordcount-integration-d16e6958104d11ea9b6e0242c0a8e004/a1a0464d-1401-4182-b951-65d6e242316f.py-wordcount-integration
datanode_1  | 19/11/26 13:08:23 INFO datanode.DataNode: Receiving BP-1254819375-192.168.224.2-1574773647488:blk_1073741826_1002 src: /192.168.224.3:51022 dest: /192.168.224.3:50010
datanode_1  | 19/11/26 13:08:23 INFO DataNode.clienttrace: src: /192.168.224.3:51022, dest: /192.168.224.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-671377419_69, offset: 0, srvID: fe78e43c-73b8-4c41-aa14-fa8b56e93dbe, blockid: BP-1254819375-192.168.224.2-1574773647488:blk_1073741826_1002, duration: 4394725
datanode_1  | 19/11/26 13:08:23 INFO datanode.DataNode: PacketResponder: BP-1254819375-192.168.224.2-1574773647488:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/26 13:08:23 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-d16e6958104d11ea9b6e0242c0a8e004/a1a0464d-1401-4182-b951-65d6e242316f.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-671377419_69
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python2-1082_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python2-1082_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1082_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1082_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python2-1082_namenode_1 ... done
Aborting on container exit...

real	1m36.511s
user	0m1.310s
sys	0m0.170s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-1082 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-1082_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1082_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1082_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1082_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1082_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1082_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-1082_test_net

real	0m0.886s
user	0m0.610s
sys	0m0.177s
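
For context on the fused stages logged above (read/split/pair_with_one/group/count/format/write): this suite runs the classic wordcount against HDFS under the DirectRunner. A minimal sketch of a pipeline with that shape, using illustrative HDFS settings rather than the exact flags of this job:

    import re

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # hdfs_host/hdfs_port/hdfs_user are HadoopFileSystemOptions flags; values are placeholders.
    options = PipelineOptions(hdfs_host='namenode', hdfs_port=50070, hdfs_user='root')
    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))
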

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
Using default tag: latest
latest: Pulling from library/mongo
Digest: sha256:1a9478d8188d6be31dd2e8de076d402edf20446e54933aad7ff49f5b457d486c
Status: Image is up to date for mongo:latest
2178cf5939a763bd97041f7708132109db2ae4fd58ed3e18800cd49f95c12a40
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.25.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.9.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.11.0)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.8)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.6.6)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.15.1)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.7.0,>=0.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.3)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (4.6.6)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.30.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.18.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.18.0.dev0) (42.0.1)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.18.0.dev0) (2.4.5)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.18.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.23)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2019.9.11)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.8)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.18.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>, outside environment <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Writing 100000 documents to mongodb
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:70: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size)
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f484f7ed8c0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f484f7ed9b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f484f7eda28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f484f7edaa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f484f7edb18> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f484f7edc08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f484f7edc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f484f7edcf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f484f7edd70> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f484f7eded8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f484f7edf50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f484ff86050> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f484ff0b190> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_Create documents/Read_3)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_5)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:__main__:Writing 100000 documents to mongodb finished in 53.899 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1574773731
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:85: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f484f7ed8c0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f484f7ed9b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f484f7eda28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f484f7edaa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f484f7edb18> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f484f7edc08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f484f7edc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f484f7edcf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f484f7edd70> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f484f7eded8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f484f7edf50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f484ff86050> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f484513a8d0> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/0))))))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/1)))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 22.162 seconds
mongoioit27625
mongoioit27625
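
The mongodbio suite above round-trips 100000 documents through the (still experimental, per the FutureWarnings) WriteToMongoDB and ReadFromMongoDB transforms. A minimal sketch of the same write-then-read shape, with placeholder URI, database and collection values:

    import apache_beam as beam
    from apache_beam.io.mongodbio import ReadFromMongoDB, WriteToMongoDB

    URI, DB, COLL = 'mongodb://localhost:27017', 'beam_mongodbio_it_db', 'integration_test'

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'number': i} for i in range(1000)])
         | WriteToMongoDB(uri=URI, db=DB, coll=COLL, batch_size=100))

    with beam.Pipeline() as p:
        (p
         | ReadFromMongoDB(uri=URI, db=DB, coll=COLL)
         | 'Map' >> beam.Map(lambda doc: doc['number']))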

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 48s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/g4carokyo5vje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1081

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1081/display/redirect>

Changes:


------------------------------------------
[...truncated 736.80 KB...]
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/11/26 12:10:43 INFO datanode.webhdfs: 172.22.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/11/26 12:10:43 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.22.0.3:50010 for /kinglear.txt
datanode_1  | 19/11/26 12:10:43 INFO datanode.DataNode: Receiving BP-1466968368-172.22.0.2-1574770193826:blk_1073741825_1001 src: /172.22.0.3:35146 dest: /172.22.0.3:50010
datanode_1  | 19/11/26 12:10:43 INFO DataNode.clienttrace: src: /172.22.0.3:35146, dest: /172.22.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1003857736_67, offset: 0, srvID: ff4c34e4-3763-4e9b-9641-019d2c831f58, blockid: BP-1466968368-172.22.0.2-1574770193826:blk_1073741825_1001, duration: 17647373
datanode_1  | 19/11/26 12:10:43 INFO datanode.DataNode: PacketResponder: BP-1466968368-172.22.0.2-1574770193826:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/26 12:10:43 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/11/26 12:10:43 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/11/26 12:10:44 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_1003857736_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | /usr/local/lib/python2.7/site-packages/apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
test_1      |   'You are using Apache Beam with Python 2. '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fbc2133a5d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fbc2133a6d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fbc2133a750> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fbc2133a7d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fbc2133a850> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fbc2133a950> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fbc2133a9d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fbc2133aa50> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fbc2133aad0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fbc2133ac50> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fbc2133acd0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fbc2133ad50> ====================
test_1      | INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fbc21266590> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/11/26 12:10:47 INFO datanode.webhdfs: 172.22.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:apache_beam.io.hadoopfilesystem:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/11/26 12:10:49 INFO datanode.webhdfs: 172.22.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-c723db66104511eaa1170242ac160004/9fe72919-3dde-4685-8327-773803465309.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/11/26 12:10:50 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.22.0.3:50010 for /beam-temp-py-wordcount-integration-c723db66104511eaa1170242ac160004/9fe72919-3dde-4685-8327-773803465309.py-wordcount-integration
datanode_1  | 19/11/26 12:10:50 INFO datanode.DataNode: Receiving BP-1466968368-172.22.0.2-1574770193826:blk_1073741826_1002 src: /172.22.0.3:35166 dest: /172.22.0.3:50010
datanode_1  | 19/11/26 12:10:50 INFO DataNode.clienttrace: src: /172.22.0.3:35166, dest: /172.22.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-921538414_69, offset: 0, srvID: ff4c34e4-3763-4e9b-9641-019d2c831f58, blockid: BP-1466968368-172.22.0.2-1574770193826:blk_1073741826_1002, duration: 5148789
datanode_1  | 19/11/26 12:10:50 INFO datanode.DataNode: PacketResponder: BP-1466968368-172.22.0.2-1574770193826:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/26 12:10:50 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-c723db66104511eaa1170242ac160004/9fe72919-3dde-4685-8327-773803465309.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-921538414_69
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python2-1081_test_1 exited with code 0
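
The test container above runs the Beam wordcount example against the dockerized HDFS cluster. A minimal sketch of a pipeline with that shape, assuming the namenode host and webhdfs port from this docker-compose setup (the hdfs_* options come from Beam's HadoopFileSystemOptions):

    # Sketch of wordcount over HDFS. Host, port, and user are assumptions
    # matching a local docker-compose cluster, not the job's real config.
    import re
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        hdfs_host='namenode', hdfs_port=50070, hdfs_user='root')
    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % (kv[0], kv[1]))
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))
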
Stopping hdfs_it-jenkins-beam_postcommit_python2-1081_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1081_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1081_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python2-1081_namenode_1 ... done
Aborting on container exit...

real	1m36.592s
user	0m1.402s
sys	0m0.187s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-1081 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-1081_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1081_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1081_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1081_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1081_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1081_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-1081_test_net

real	0m0.843s
user	0m0.662s
sys	0m0.099s

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
Using default tag: latest
latest: Pulling from library/mongo
Digest: sha256:1a9478d8188d6be31dd2e8de076d402edf20446e54933aad7ff49f5b457d486c
Status: Image is up to date for mongo:latest
aa465902db66950586df626a4e7d53a3d304afcb74448218fe2c33b879747948
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.25.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.9.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.11.0)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.8)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.6.6)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.15.1)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.7.0,>=0.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.3)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (4.6.6)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.30.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.18.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.18.0.dev0) (42.0.1)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.18.0.dev0) (2.4.5)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.18.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.23)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2019.9.11)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.8)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.18.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python,> outside environment <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
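
The DirectRunner fallback logged here can be avoided by naming the runner explicitly; a minimal sketch:

    # Selecting the runner explicitly instead of relying on the fallback.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(runner='DirectRunner')  # or e.g. 'DataflowRunner'
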
INFO:__main__:Writing 100000 documents to mongodb
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:70: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size)
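
The warning points at the write half of mongodbio_it_test.py. A minimal sketch of a pipeline of that shape; the URI, db/collection names, document contents, and batch_size below are illustrative assumptions, not the job's actual values:

    # Sketch of the write side flagged as experimental above.
    import apache_beam as beam
    from apache_beam.io.mongodbio import WriteToMongoDB

    with beam.Pipeline() as p:
        (p
         | 'Create documents' >> beam.Create(
             [{'number': i} for i in range(100000)])
         | 'WriteToMongoDB' >> WriteToMongoDB(
             uri='mongodb://localhost:27017',
             db='beam_mongodbio_it_db',
             coll='integration_test',
             batch_size=100))
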
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fc15f4578c0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fc15f4579b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fc15f457a28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fc15f457aa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fc15f457b18> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fc15f457c08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fc15f457c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fc15f457cf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fc15f457d70> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fc15f457ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fc15f457f50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fc15fb6c050> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fc15e6fe890> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_Create documents/Read_3)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_5)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:__main__:Writing 100000 documents to mongodb finished in 54.938 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1574770277
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:85: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fc15f4578c0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fc15f4579b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fc15f457a28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fc15f457aa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fc15f457b18> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fc15f457c08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fc15f457c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fc15f457cf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fc15f457d70> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fc15f457ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fc15f457f50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fc15fb6c050> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fc15f2ed190> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 22.390 seconds
mongoioit28008
mongoioit28008

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle'> line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 55s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/k27amwv32r66c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1080

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1080/display/redirect>

Changes:


------------------------------------------
[...truncated 1.02 MB...]
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "encode.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s8"
        }, 
        "serialized_fn": "ref_AppliedPTransform_encode_11", 
        "user_name": "encode"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s10", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s9"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output6a24d7d0-7f89-4a19-b1f3-ff871d3f3880", 
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-11-26T06:51:54.971216Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-25_22_51_52-13798332160415640530'
 location: u'us-central1'
 name: u'beamapp-jenkins-1126065140-480327'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-11-26T06:51:54.971216Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-11-25_22_51_52-13798332160415640530]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_22_51_52-13798332160415640530?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-11-25_22_51_52-13798332160415640530 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:57.210Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:57.972Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.523Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.525Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.534Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.544Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.547Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.552Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.571Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.574Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.577Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.579Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.581Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.584Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.586Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.588Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.590Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.592Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.594Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.602Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.614Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.626Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.763Z: JOB_MESSAGE_DEBUG: Executing wait step start2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.814Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:51:58.819Z: JOB_MESSAGE_BASIC: Starting 1 workers...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:01.294Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:01.294Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:26.924Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:27.704Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:27.704Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:28.378Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-11-26T06:52:43.330Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
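
The fused consumers listed above (decode, split, pair_with_one, WindowInto, group, count, format, encode) correspond to the streaming wordcount example. A minimal sketch of a pipeline with that topology, with placeholder Pub/Sub topic names:

    # Sketch of the streaming wordcount shape behind the fusion messages
    # above. Topic names are placeholders, not the job's real topics.
    import re
    import apache_beam as beam
    from apache_beam.transforms import window
    from apache_beam.options.pipeline_options import (
        PipelineOptions, StandardOptions)

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             topic='projects/<project>/topics/<input>')
         | 'decode' >> beam.Map(lambda data: data.decode('utf-8'))
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | beam.WindowInto(window.FixedWindows(15))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % (kv[0], kv[1]))
         | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
             topic='projects/<project>/topics/<output>'))
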
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2019-11-25_22_51_52-13798332160415640530 after 364 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
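
The timeout fires while wait_until_finish() is blocked on the still-running Dataflow job, after which the harness attempts cancel(). A defensive sketch of that pattern with a bounded wait; the duration value and the `pipeline` variable are illustrative:

    # Bound the wait and cancel the job if it has not finished.
    # wait_until_finish takes a duration in milliseconds.
    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # `pipeline` is an already-built beam.Pipeline
    state = result.wait_until_finish(duration=10 * 60 * 1000)
    if state != PipelineState.DONE:
        result.cancel()
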

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5466.405s

FAILED (SKIP=4, errors=2)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 23s
115 actionable tasks: 90 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/d3by7re75raj6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1079

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1079/display/redirect?page=changes>

Changes:

[suztomo] Dataflow Java worker to avoid undeclared Guava

[suztomo] Beam SQL JDBC driver not to declare unused Guava

[suztomo] KinesisIO to declare Guava dependency

[suztomo] ZetaSQL to declare Guava dependency

[suztomo] Removed unused dependency from elasticsearch-tests-2


------------------------------------------
[...truncated 1021.01 KB...]
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
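
As the warning suggests, WriteToBigQuery is the replacement for the deprecated BigQuerySink. A minimal sketch with placeholder table and schema; `rows` is assumed to be a PCollection of dicts:

    # Replacement pattern for the deprecated BigQuerySink flagged above.
    # Table name and schema are placeholders, not the test's real values.
    import apache_beam as beam

    rows | beam.io.WriteToBigQuery(
        'my-project:my_dataset.my_table',
        schema='word:STRING, count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
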
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
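
The recurring "options is deprecated" warnings above come from reading options back off the pipeline object; the suggested pattern is to keep the PipelineOptions instance you constructed and call view_as on it directly. A minimal sketch, with a hypothetical temp_location bucket:

    # Hold on to the options object instead of reading p.options later.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])  # hypothetical bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location     # not p.options.view_as(...)
    p = beam.Pipeline(options=options)
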
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
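
The FutureWarnings above flag fileio.MatchAll and fileio.ReadMatches as experimental; they compose the way fileio_test.py uses them. A minimal sketch, assuming a local glob of our own choosing:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        readable = (p
                    | beam.Create(['/tmp/input-*.txt'])   # hypothetical file pattern
                    | fileio.MatchAll()                   # pattern -> FileMetadata
                    | fileio.ReadMatches())               # FileMetadata -> ReadableFile
        readable | 'Contents' >> beam.Map(lambda f: f.read_utf8())
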
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
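
The BigQuerySink deprecation above (and earlier in this suite) points at WriteToBigQuery. A minimal sketch of the suggested replacement, with a hypothetical table spec and schema:

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import BigQueryDisposition, WriteToBigQuery

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])        # toy rows
         | WriteToBigQuery(
             'my-project:my_dataset.word_counts',             # hypothetical table
             schema='word:STRING,count:INTEGER',
             create_disposition=BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=BigQueryDisposition.WRITE_APPEND))
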
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_56-14407833482298394730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_19_57-11531778054802013703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_28_07-16221463171951837463?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 620, in __ProcessHttpResponse
    return self.__client.DeserializeMessage(response_type, content)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 446, in DeserializeMessage
    message = encoding.JsonToMessage(response_type, data)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 287, in __decode_dictionary
    for item in value]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py",> line 294, in __decode_dictionary
    setattr(message, field.name, self.decode_field(field, value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py",> line 337, in decode_field
    _ProtoJsonApiTools, self).decode_field(field, value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
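
The TimedOutException above is raised by the signal handler in nose's multiprocess plugin when a worker exceeds its per-test time budget, so the Dataflow job was cancelled mid-flight rather than failing an assertion. A sketch of a local re-run with a looser budget (assuming nose's multiprocess plugin and its --processes/--process-timeout flags):

    nosetests apache_beam.io.gcp.bigquery_test:BigQueryStreamingInsertTransformIntegrationTests \
        --processes=8 --process-timeout=4500
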

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5477.387s

FAILED (SKIP=4, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_12_00-8650058752289145724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_26_39-4635031006746626903?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_35_11-3836069188381695551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_42_19-4362372453122920065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_50_19-6495909074585449329?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_56-6604251185357258514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_31_26-5047514647572271854?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_38_32-15387969934083406131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_45_19-10166483175491184624?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_58-2360547933856540298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_24_23-8695241411392508680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_31_09-6241397758001627855?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_37_54-8045538797945216599?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_44_31-4188515730771494252?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_56-9039593046929732275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_28_18-17616235953180805009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_36_32-6278256398052836704?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_43_12-12686763093706684210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_56-5422662969281867268?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_18_46-8532276333576310861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_27_48-1742266138045417794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_35_03-12720180950977194065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_42_09-6304404067766894147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_49_30-4023000938723744475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_59-744809211503559640?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_19_55-6528594596176977278?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_27_26-2281431329877376910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_34_15-12473066804919803339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_40_53-10022703341946200966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_48_33-9609903549015779538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_56_01-9850078868805040995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_22_02_50-1546906945855537658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_11_56-9029152524988724706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_20_42-2415491137126462845?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_31_09-16201370835283138454?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_49_06-14622598646417634316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_21_55_50-14021583808679180255?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
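
Combining the two suggestions above, a diagnostic re-run of the failing task might look like this (assuming the Gradle wrapper at the repository root):

    ./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT \
        --stacktrace --info --warning-mode all
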

BUILD FAILED in 1h 32m 49s
115 actionable tasks: 91 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/xml4faqte2pe2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python2 #1078

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1078/display/redirect>

Changes:


------------------------------------------
[...truncated 506.43 KB...]
WARNING:apache_beam.io.filebasedsink:Deleting 2 existing files in target path matching: -*-of-%(num_shards)05d
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (0ba5cb2f808a5e09dadae4c99d2b0b8e) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (0ba5cb2f808a5e09dadae4c99d2b0b8e).
[[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (0ba5cb2f808a5e09dadae4c99d2b0b8e) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem 0ba5cb2f808a5e09dadae4c99d2b0b8e.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (c059def31710a805b936b4ba4b229e28) switched from RUNNING to FINISHED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (c059def31710a805b936b4ba4b229e28).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (c059def31710a805b936b4ba4b229e28) [FINISHED]
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (2d017e9cb43d21bdc0ad743f7a2abdd7) switched from RUNNING to FINISHED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (2d017e9cb43d21bdc0ad743f7a2abdd7).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (2d017e9cb43d21bdc0ad743f7a2abdd7) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map c059def31710a805b936b4ba4b229e28.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 2d017e9cb43d21bdc0ad743f7a2abdd7.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (8874ba617429609367b5af3fe515179d) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/FinalizeWrite (2/2) (8874ba617429609367b5af3fe515179d).
[[1]write/Write/WriteImpl/FinalizeWrite (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/FinalizeWrite (2/2) (8874ba617429609367b5af3fe515179d) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite 8874ba617429609367b5af3fe515179d.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) (0ba5cb2f808a5e09dadae4c99d2b0b8e) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (c059def31710a805b936b4ba4b229e28) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (2d017e9cb43d21bdc0ad743f7a2abdd7) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (2/2) (8874ba617429609367b5af3fe515179d) switched from RUNNING to FINISHED.
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 2 (skipped: 0), batches: 2, num_threads: 2
INFO:apache_beam.io.filebasedsink:Renamed 2 shards in 0.12 seconds.
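
The finalize_write/rename messages above reflect the file-based sink's two-phase write: bundles go to temporary files, then finalize renames them into the -SSSSS-of-NNNNN shard layout. A minimal sketch that produces the same two-shard naming, with a hypothetical output prefix:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create(['a', 'b', 'c'])
         | beam.io.WriteToText('/tmp/out', num_shards=2))  # /tmp/out-00000-of-00002, /tmp/out-00001-of-00002
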
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Closing environment urn: "beam:env:external:v1"
payload: "\n\021\022\017localhost:41643"

INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]write/Write/WriteImpl/FinalizeWrite (1/2) (ada6eaaa940c48907cbf3ac60b9873c9) switched from RUNNING to FINISHED.
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]write/Write/WriteImpl/FinalizeWrite (1/2) (ada6eaaa940c48907cbf3ac60b9873c9).
[[1]write/Write/WriteImpl/FinalizeWrite (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]write/Write/WriteImpl/FinalizeWrite (1/2) (ada6eaaa940c48907cbf3ac60b9873c9) [FINISHED]
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]write/Write/WriteImpl/FinalizeWrite ada6eaaa940c48907cbf3ac60b9873c9.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/FinalizeWrite (1/2) (ada6eaaa940c48907cbf3ac60b9873c9) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-jenkins-1126001824-6deffa00 (cbc6420a8af891c422617f32e7ef3e88) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint coordinator for job cbc6420a8af891c422617f32e7ef3e88.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - Shutting down
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job cbc6420a8af891c422617f32e7ef3e88 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-jenkins-1126001824-6deffa00(cbc6420a8af891c422617f32e7ef3e88).
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8135}, allocationId: 0774668ca6f7d413e07262423dd5a120, jobId: cbc6420a8af891c422617f32e7ef3e88).
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 3347c963cab48ff466b47e5831473063: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager a3695387fede4d9e33fb029c0f124dff@akka://flink/user/jobmanager_1 for job cbc6420a8af891c422617f32e7ef3e88 from the resource manager.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[mini-cluster-io-thread-14] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job cbc6420a8af891c422617f32e7ef3e88 with leader id a3695387fede4d9e33fb029c0f124dff lost leadership.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8135}, allocationId: 714bf7ffc839874277e7e9f328d00e7e, jobId: cbc6420a8af891c422617f32e7ef3e88).
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job cbc6420a8af891c422617f32e7ef3e88 from job leader monitoring.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job cbc6420a8af891c422617f32e7ef3e88.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job cbc6420a8af891c422617f32e7ef3e88.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job cbc6420a8af891c422617f32e7ef3e88 because it is not registered.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 3347c963cab48ff466b47e5831473063.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection bbd050df-ab8d-4a66-ac6d-896bc4d004af because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-accd3c2d-d753-4181-a52e-7fd9872e46b4
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-601d49ea-c7a7-41c5-8ec9-1b0d0f6ad4fb
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-4d9cf9ef-c2aa-4762-a449-a8e2366d4207
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-15] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44123
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 61553 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 1, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=words}: 96, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 27, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 44, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 96, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 96, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}: 0, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 44, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2532>)_30}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 2, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: 2, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=empty_lines}: 2, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 2, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 2, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_lengths}: 298, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 2, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 0, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0)Distributions(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=772, count=38, min=18, max=27}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=801, count=34, min=20, max=29}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=672, count=35, min=17, max=23}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=14, count=1, min=14, max=14}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 
DistributionResult{sum=685, count=1, min=685, max=685}, 17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=106, count=2, min=53, max=53}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=774, count=36, min=19, max=28}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=276, count=2, min=138, max=138}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=278, count=2, min=139, max=139}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=13, count=1, min=13, max=13}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user_distribution {NAMESPACE=__main__.WordExtractingDoFn, PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_len_dist}: DistributionResult{sum=298, count=96, min=1, max=10}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=81, count=1, min=81, max=81}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=276, count=2, min=138, max=138}, 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=271, count=1, min=271, max=271}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=671, count=18, min=14, max=84}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=631, count=38, min=14, max=23}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=742, count=40, min=16, max=25}, 36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=685, count=1, min=685, max=685}, 40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, count=1, min=15, max=15}, 6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: DistributionResult{sum=278, count=2, min=139, max=139}))
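The __metricscontainers dump above is the flat, one-line rendering of a MetricQueryResults object. For reference, a hedged sketch of how the same committed values are normally pulled from a PipelineResult in the Python SDK (the toy pipeline and print format are illustrative; the namespace filter matches the wordcount metrics in this dump):

import apache_beam as beam
from apache_beam.metrics.metric import MetricsFilter

p = beam.Pipeline()
_ = p | beam.Create(['a', 'b']) | beam.Map(len)
result = p.run()
result.wait_until_finish()

# query() returns {'counters': [...], 'distributions': [...], 'gauges': [...]};
# each MetricResult's .committed corresponds to a value printed in a dump like
# the one above (for a distribution, a DistributionResult with sum/count/min/max).
metrics = result.metrics().query(
    MetricsFilter().with_namespace('__main__.WordExtractingDoFn'))
for counter in metrics['counters']:
    print('%s: %s' % (counter.key, counter.committed))
for dist in metrics['distributions']:
    print('%s: %s' % (dist.key, dist.committed))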
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-tempCtm23e/artifactsBdGPi1/job_9324299b-9b97-48b8-93b5-c423abc83134/MANIFEST has 1 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-tempCtm23e/artifactsBdGPi1/job_9324299b-9b97-48b8-93b5-c423abc83134/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting job metrics for BeamApp-jenkins-1126001824-6deffa00_4b2dd7a3-16f8-49bf-9107-4f7d901d1622
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished getting job metrics for BeamApp-jenkins-1126001824-6deffa00_4b2dd7a3-16f8-49bf-9107-4f7d901d1622
INFO:root:number of empty lines: 2
INFO:root:average word length: 3
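Those two numbers come from user metrics declared in the wordcount example's WordExtractingDoFn; the dump above lists them under NAMESPACE=__main__.WordExtractingDoFn. A sketch of how such counters and a distribution are declared in the Python SDK (reconstructed from the metric names in the dump, not copied from this build's source):

import re

import apache_beam as beam
from apache_beam.metrics import Metrics

class WordExtractingDoFn(beam.DoFn):
    def __init__(self):
        # Using the class as namespace yields __main__.WordExtractingDoFn.
        self.empty_line_counter = Metrics.counter(self.__class__, 'empty_lines')
        self.word_lengths_counter = Metrics.counter(self.__class__, 'word_lengths')
        self.word_lengths_dist = Metrics.distribution(self.__class__, 'word_len_dist')

    def process(self, element):
        line = element.strip()
        if not line:
            self.empty_line_counter.inc(1)
        for word in re.findall(r"[\w']+", line):
            self.word_lengths_counter.inc(len(word))
            self.word_lengths_dist.update(len(word))
            yield word

The "average word length: 3" above is consistent with the dump: word_lengths (298) divided by the distribution's count (96), truncated.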

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.runners.portability.fn_api_runner_transforms"
19/11/26 00:19:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44447
19/11/26 00:19:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:41551
19/11/26 00:19:35 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService started on localhost:46387
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/11/26 00:19:36 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b
19/11/26 00:19:36 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b
INFO:root:Waiting until the pipeline has finished because the environment "LOOPBACK" has started a component necessary for the execution.
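For orientation, a hedged sketch of the submission side implied by the lines above: the client targets the JobService port the driver just logged, and LOOPBACK hosts the SDK worker inside the submitting process, which is why the client must stay alive until the job finishes (the port is copied from this run and changes every time):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:46387',
    '--environment_type=LOOPBACK',  # SDK harness runs in this process
])
with beam.Pipeline(options=options) as p:
    _ = p | beam.Create(['hello', 'world']) | beam.Map(lambda w: w.upper())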
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/26 00:19:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/26 00:19:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/26 00:19:37 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
19/11/26 00:19:37 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/11/26 00:19:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b on Spark master local[4]
19/11/26 00:19:38 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
19/11/26 00:19:38 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:33953.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
19/11/26 00:19:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33881.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:34211
19/11/26 00:19:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:19:41 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/11/26 00:20:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b: Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
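The two filebasedsink lines are the finalize step of a sharded text sink: temporary bundle files are renamed into place after stale files matching the -*-of-%(num_shards)05d glob are deleted (the earlier WARNING). A minimal sketch of a sink configured this way, with an illustrative output prefix:

import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | beam.Create(['one: 1', 'two: 2'])
     # Finalize renames temp files to <prefix>-00000-of-00004 ... -00003-of-00004.
     | beam.io.WriteToText('/tmp/wordcount/output', num_shards=4))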
19/11/26 00:20:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b finished.
19/11/26 00:20:39 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 00:20:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempB1pEKD/artifacts59wWpd/job_ea61efd4-92b5-4e17-937e-70c314c180af/MANIFEST has 1 artifact locations
19/11/26 00:20:39 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempB1pEKD/artifacts59wWpd/job_ea61efd4-92b5-4e17-937e-70c314c180af/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/11/26 00:20:39 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting job metrics for BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b
19/11/26 00:20:39 INFO org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1126001936-22aabbab_2b43ba5b-0cd1-428b-af95-48938591410b
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574727640.032714483","description":"Error received from peer ipv4:127.0.0.1:34211","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574727640.032714483","description":"Error received from peer ipv4:127.0.0.1:34211","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574727640.032809549","description":"Error received from peer ipv4:127.0.0.1:33953","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1574727640.032738717","description":"Error received from peer ipv4:127.0.0.1:33881","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
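All three thread dumps above are the same teardown race, not a job failure: the job already reached DONE, the job server closed the control, state and data channels, and the worker's reader threads then saw UNAVAILABLE / "Socket closed". A hedged sketch of how that surfaces to gRPC client code (drain is a hypothetical helper, not Beam's):

import grpc

def drain(response_iterator):
    """Yield streamed responses, tolerating server-side channel shutdown."""
    try:
        for response in response_iterator:
            yield response
    except grpc.RpcError as err:
        # _Rendezvous is a grpc.RpcError; after shutdown its code() is
        # StatusCode.UNAVAILABLE with details "Socket closed".
        if err.code() != grpc.StatusCode.UNAVAILABLE:
            raise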


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 18s
112 actionable tasks: 86 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/eveaxri7tjxim

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1077

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1077/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Skip manifest when no artifacts are staged


------------------------------------------
[...truncated 766.87 KB...]
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-1077 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-1077_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1077_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1077_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1077_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1077_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1077_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-1077_test_net

real	0m0.873s
user	0m0.615s
sys	0m0.166s
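The "+ finally" in the trace above is the test script's cleanup function; the equivalent teardown as a hedged Python sketch (compose project name copied from this run):

import subprocess

PROJECT = 'hdfs_IT-jenkins-beam_PostCommit_Python2-1077'

def teardown():
    # Removes the test/datanode/namenode containers and the test_net network,
    # matching the "Removing ..." lines above.
    subprocess.check_call(['docker-compose', '-p', PROJECT, '--no-ansi', 'down'])

try:
    pass  # run the HDFS integration tests against the compose cluster here
finally:
    teardown()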

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
Using default tag: latest
latest: Pulling from library/mongo
7ddbc47eeb70: Pulling fs layer
c1bbdc448b72: Pulling fs layer
8c3b70e39044: Pulling fs layer
45d437916d57: Pulling fs layer
e119fb0e0a55: Pulling fs layer
91f0b9bae1ea: Pulling fs layer
53e7c2967f11: Pulling fs layer
69a945568374: Pulling fs layer
93333bc225a7: Pulling fs layer
b9c10bd6c9bd: Pulling fs layer
7f4e3538e99c: Pulling fs layer
1164b51d180a: Pulling fs layer
a715a7d71f27: Pulling fs layer
91f0b9bae1ea: Waiting
53e7c2967f11: Waiting
7f4e3538e99c: Waiting
1164b51d180a: Waiting
69a945568374: Waiting
b9c10bd6c9bd: Waiting
93333bc225a7: Waiting
e119fb0e0a55: Waiting
45d437916d57: Waiting
8c3b70e39044: Verifying Checksum
8c3b70e39044: Download complete
c1bbdc448b72: Verifying Checksum
c1bbdc448b72: Download complete
45d437916d57: Download complete
7ddbc47eeb70: Verifying Checksum
7ddbc47eeb70: Download complete
e119fb0e0a55: Download complete
69a945568374: Download complete
91f0b9bae1ea: Verifying Checksum
91f0b9bae1ea: Download complete
53e7c2967f11: Download complete
93333bc225a7: Download complete
b9c10bd6c9bd: Verifying Checksum
b9c10bd6c9bd: Download complete
a715a7d71f27: Verifying Checksum
a715a7d71f27: Download complete
1164b51d180a: Verifying Checksum
1164b51d180a: Download complete
7ddbc47eeb70: Pull complete
c1bbdc448b72: Pull complete
8c3b70e39044: Pull complete
45d437916d57: Pull complete
7f4e3538e99c: Verifying Checksum
7f4e3538e99c: Download complete
e119fb0e0a55: Pull complete
91f0b9bae1ea: Pull complete
53e7c2967f11: Pull complete
69a945568374: Pull complete
93333bc225a7: Pull complete
b9c10bd6c9bd: Pull complete
7f4e3538e99c: Pull complete
1164b51d180a: Pull complete
a715a7d71f27: Pull complete
Digest: sha256:1a9478d8188d6be31dd2e8de076d402edf20446e54933aad7ff49f5b457d486c
Status: Downloaded newer image for mongo:latest
154cf749957fdc16d006e02392130151b3fc1a16de3e4ebb8d380f63ffca9249
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.25.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.9.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.10.0)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.8)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.6.6)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.15.1)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.7.0,>=0.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.6.3)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (4.6.6)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.30.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.18.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.18.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.18.0.dev0) (42.0.1)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.18.0.dev0) (2.4.5)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.18.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.23)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2019.9.11)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.18.0.dev0) (2.8)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.18.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.18.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>, outside environment <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Writing 100000 documents to mongodb
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:70: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size)
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fe5afe26aa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fe5afe26b90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fe5afe26c08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fe5afe26c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fe5afe26cf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fe5afe26de8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fe5afe26e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fe5afe26ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fe5afe26f50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fe5afe1f140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fe5afe1f1b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fe5afe1f230> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fe5afbd6110> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_Create documents/Read_3)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_5)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:__main__:Writing 100000 documents to mongodb finished in 53.306 seconds
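A hedged sketch of the write half of this integration test, reconstructed from the log: the document count and database name are taken from this run; the URI, collection name and batch_size value are assumptions (the real collection is timestamped, e.g. integration_test_1574725561):

import apache_beam as beam
from apache_beam.io.mongodbio import WriteToMongoDB  # experimental, per the FutureWarning

docs = [{'number': i} for i in range(100000)]
with beam.Pipeline() as p:
    (p
     | 'Create documents' >> beam.Create(docs)
     | 'WriteToMongoDB' >> WriteToMongoDB(uri='mongodb://localhost:27017',
                                          db='beam_mongodbio_it_db',
                                          coll='integration_test',
                                          batch_size=1000))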
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1574725561
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:85: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fe5afe26aa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fe5afe26b90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fe5afe26c08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fe5afe26c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fe5afe26cf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fe5afe26de8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fe5afe26e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fe5afe26ed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fe5afe26f50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fe5afe1f140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fe5afe1f1b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fe5afe1f230> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fe5ac45e810> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.984 seconds
mongoioit27373
mongoioit27373
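And the matching read-and-verify half, reconstructed from the fused stages logged above (same caveats on URI and collection name; the Map on doc['number'] and the assert_that stages are visible in the log):

import apache_beam as beam
from apache_beam.io.mongodbio import ReadFromMongoDB
from apache_beam.testing.util import assert_that, equal_to

with beam.Pipeline() as p:
    numbers = (p
               | 'ReadFromMongoDB' >> ReadFromMongoDB(uri='mongodb://localhost:27017',
                                                      db='beam_mongodbio_it_db',
                                                      coll='integration_test')
               | 'Map' >> beam.Map(lambda doc: doc['number']))
    assert_that(numbers, equal_to(list(range(100000))))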

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 36s
111 actionable tasks: 88 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/hqaegirzh4xj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1076

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1076/display/redirect?page=changes>

Changes:

[worldkzd] fix typos

[github] Update class_test.go

[worldkzd] keep 'a https'


------------------------------------------
[...truncated 987.54 KB...]
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 4, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 9, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 6, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}: 9, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 4, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 6, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, 37Map(<lambda at external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2532>)_26}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}: 0, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0)Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, 
max=21}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, 
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, 46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}))
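The MetricQueryResults dump above groups metrics into Counters (element counts and per-PTRANSFORM execution times) and Distributions (sampled byte sizes per PCOLLECTION). A minimal sketch of querying such metrics from a PipelineResult in the Python SDK (the metric name in the filter is illustrative):

    from apache_beam.metrics.metric import MetricsFilter

    # result is the PipelineResult returned by pipeline.run().
    query_result = result.metrics().query(
        MetricsFilter().with_name('element_count'))  # illustrative name
    for counter in query_result['counters']:
        print(counter.key, counter.committed)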
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at /tmp/beam-artifact-staging/job_355d330c-ad01-49c9-b11b-cad846d2e414/MANIFEST has 0 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-artifact-staging/job_355d330c-ad01-49c9-b11b-cad846d2e414/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
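The BigQuerySink deprecation above points to WriteToBigQuery; a minimal sketch of the replacement transform (project, dataset, table, and schema below are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'month': 1, 'tornado_count': 3}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',  # placeholder
                 schema='month:INTEGER,tornado_count:INTEGER',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))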
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
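The repeated "options is deprecated" warnings flag reads of <pipeline>.options; the supported pattern is to keep a reference to the PipelineOptions passed at construction. A minimal sketch (the temp_location value is a placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(temp_location='gs://my-bucket/tmp')  # placeholder
    p = beam.Pipeline(options=options)
    # Read settings from the options object rather than from p.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location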
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_00-17175749708108419317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_09_38-6078996102992964609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_18_40-6100207585485348868?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5505.068s

FAILED (SKIP=4, errors=1)
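The lone error above is a harness timeout rather than a pipeline failure: nose's multiprocess plugin raised TimedOutException after its per-test timeout expired while the Dataflow job was still being polled over HTTPS. Assuming the suite runs under that plugin, the relevant knobs look like this (the process count and timeout values are illustrative, not the suite's actual settings):

    nosetests --processes=8 --process-timeout=4500 apache_beam.io.gcp.bigquery_test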
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_01-12480845351670503316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_17_10-18037608479016754692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_25_51-6703553877765923653?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_33_52-4312188965287390069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_42_00-314833254344199215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_02-7984012451038908912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_19_36-2395907778577121256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_28_10-6122018374352853725?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_35_39-4866696443496679201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_07-1887967002173676910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_14_26-12937155990582376496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_22_02-11127373026559958994?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_29_08-3502218073268092987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_36_11-10898719254966749755?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_01_59-4812488104152396642?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_19_47-11589449111030283626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_37_41-10124703149451649830?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_45_15-13227333762121213649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_01-9048646026052595404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_09_15-379314293312791164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_18_24-8098970727515184140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_26_02-9473186242079099248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_33_39-13273683186549683818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_41_29-10001149289026370680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_00-6054872482878101897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_09_42-6432388407708125032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_18_04-16161851145854577238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_25_34-2495211788777881365?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_32_38-12929985269998151413?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_40_33-2382787299595832127?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_48_03-5048831441159251743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_54_32-14248825169490344502?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_02_00-6590654235363953674?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_11_06-11005169695106435089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_20_58-4780934228575794255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_28_48-17252464416312391958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_14_36_29-9127299074500352470?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
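Any of the failing tasks can be re-run in isolation with the flags Gradle suggests, for example:

    ./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --stacktrace --info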

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
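For example:

    ./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --warning-mode all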

BUILD FAILED in 1h 32m 52s
115 actionable tasks: 90 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/7qmrsdli7bpzq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1075

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1075/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8575] Added a unit test that Reshuffle preserves timestamps


------------------------------------------
[...truncated 1.18 MB...]
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 10.0525794739 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:leader_board_it_dataset15747137399503.leader_board_teams was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 104, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

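The retry wrapper in these tracebacks (apache_beam.utils.retry) waits with exponential backoff between attempts, which is why the logged waits grow (10.05s above, then 28.8s and 46.7s below). A generic sketch of the pattern, not Beam's actual implementation:

    import random
    import time

    def call_with_backoff(fun, num_retries=4, initial_delay_secs=5.0):
        # Retry fun on any exception, growing the delay after each failure
        # and adding jitter so concurrent retries do not synchronize.
        delay = initial_delay_secs
        for attempt in range(num_retries):
            try:
                return fun()
            except Exception:
                if attempt == num_retries - 1:
                    raise
                time.sleep(delay * (1 + random.random()))
                delay *= 2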
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT total_score FROM `apache-beam-testing.leader_board_it_dataset15747137399503.leader_board_teams` WHERE total_score=5000 LIMIT 1 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 28.8042493379 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:leader_board_it_dataset15747137399503.leader_board_teams was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 104, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT total_score FROM `apache-beam-testing.leader_board_it_dataset15747137399503.leader_board_teams` WHERE total_score=5000 LIMIT 1 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 46.7499349626 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:leader_board_it_dataset15747137399503.leader_board_teams was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 104, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT total_score FROM `apache-beam-testing.leader_board_it_dataset15747137399503.leader_board_teams` WHERE total_score=5000 LIMIT 1 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------
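
The query being retried above is the matcher's verification query: the
streaming leader-board pipeline is expected to create and populate
leader_board_teams, and until it does BigQuery answers 404 NotFound, which
the retry decorator keeps backing off on. Roughly what
bigquery_matcher._query_with_retry does, sketched with the
google-cloud-bigquery client (project and dataset names copied from the log;
this is an illustration, not the matcher's exact code):

    from google.cloud import bigquery

    def query_with_retry(client, query):
        # NotFound propagates out of result() so the retry decorator
        # can back off and try again once the table exists.
        query_job = client.query(query)
        return [row.values() for row in query_job.result(timeout=60)]

    client = bigquery.Client(project='apache-beam-testing')
    rows = query_with_retry(
        client,
        'SELECT total_score FROM '
        '`apache-beam-testing.leader_board_it_dataset15747137399503.leader_board_teams` '
        'WHERE total_score=5000 LIMIT 1')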

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2133, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1796, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1737, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1152, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
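
The TimedOutException at the bottom of this traceback is not raised by the
test body itself: nose's multiprocess plugin installs an alarm-based signal
handler that interrupts whatever blocking call the test is sitting in (here
an SSL read while polling the Dataflow job's state) once the per-test
timeout expires. A simplified sketch of that mechanism, with illustrative
timeout values and a hypothetical run_test() standing in for the hung test:

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        raise TimedOutException()

    def run_test():        # hypothetical stand-in for the blocked test
        time.sleep(60)     # e.g. the blocking SSL read in the traceback

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(5)        # nose takes this value from --process-timeout
    try:
        run_test()
    except TimedOutException:
        print('test timed out')
    finally:
        signal.alarm(0)    # cancel the alarm once the test finishes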

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5472.222s

FAILED (SKIP=4, errors=2)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_13-14516847737209413729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_44_32-345969695028145750?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_53_44-8743722607344669058?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_01_33-7832626888857934248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_11-18420676879381138028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_48_59-15288751772820479650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_55_50-16686221662168146955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_03_05-3818161575777991360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_10_16-15254303439356728821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_19-12539989133404855887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_43_37-16804629650037026513?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_51_04-68799162215403857?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_57_29-16323906184674287116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_04_46-17819199071599831869?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_11-16544353463729294448?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_48_25-15798735178211352409?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_57_33-5379624597530367955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_04_32-12080714991342472984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_10-8984676535918978581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_36_26-14638343431196103743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_45_10-11372960347380848594?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_52_31-5163372502392632735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_59_21-16735691517892246375?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_07_18-3254581240820738622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_10-5313047059110755593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_37_01-844358055027388716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_45_12-10376745249135821257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_52_26-13233738158551690139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_59_16-308177571421449120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_07_07-5924613234383162937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_14_58-10366488567852244332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_21_31-14220617350469293514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_29_10-16150732323144515037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_38_15-1369021748211919432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_12_48_41-2333921826536808041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_05_58-18018849790589197878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-25_13_13_43-11935778133359205292?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 37s
115 actionable tasks: 90 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/otgarwv3kjiig

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1074

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1074/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-876] Support schemaUpdateOption in BigQueryIO (#9524)


------------------------------------------
[...truncated 2.62 MB...]
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 102.163594739 seconds before retrying copy because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:09:08 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc
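
The 404 above comes from the GCE metadata server itself: it serves tokens
only for service accounts attached to the VM, so a request for an account it
does not recognize returns 404, which oauth2client then surfaces as
HttpAccessTokenRefreshError. The failing request can be reproduced with a
plain HTTP GET (sketch using requests for brevity; oauth2client actually
goes through httplib2):

    import requests

    resp = requests.get(
        'http://metadata.google.internal/computeMetadata/v1/instance/'
        'service-accounts/844138762903-compute@developer.gserviceaccount.com'
        '/token',
        headers={'Metadata-Flavor': 'Google'})  # header the server requires
    print(resp.status_code)  # 404 in the failing builds above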

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 310.380292984 seconds before retrying copy because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:10:50 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-d64fd15a-b34a-4a61-991b-86aa925766ab/'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-d64fd15a-b34a-4a61-991b-86aa925766ab/*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/gcs\\_it\\-d64fd15a\\-b34a\\-4a61\\-991b\\-86aa925766ab\\/[^/\\\\]*'
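
The translate_pattern line shows the filesystem glob being compiled to a
regular expression: every literal character is escaped and '*' becomes "any
run of characters that does not cross a path separator". A simplified
version of that translation (the real implementation also handles '**' and
other glob features; note that older versions of re.escape also escape ':'
and '/', which is why the DEBUG line shows gs\:\/\/...):

    import re

    def translate_pattern(pattern):
        parts = []
        for ch in pattern:
            if ch == '*':
                parts.append(r'[^/\\]*')  # '*' stays within one path segment
            else:
                parts.append(re.escape(ch))
        return ''.join(parts)

    print(translate_pattern('gs://temp-storage-for-end-to-end-tests/temp-it/*'))
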
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 3.22382130938 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:16:01 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc
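
list_prefix, the operation being retried here, pages through a GCS listing
to estimate the size of the pipeline's input before reading it.
Schematically (simplified from gcsio.list_prefix; field names follow the
GCS JSON API):

    from apache_beam.io.gcp.internal.clients import storage

    def list_prefix(client, bucket, prefix):
        file_sizes = {}
        request = storage.StorageObjectsListRequest(bucket=bucket,
                                                    prefix=prefix)
        while True:
            response = client.objects.List(request)
            for item in response.items:
                file_sizes['gs://%s/%s' % (bucket, item.name)] = item.size
            if response.nextPageToken:
                request.pageToken = response.nextPageToken
            else:
                return file_sizes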

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 7.08987359944 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:16:04 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 19.9291221469 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:16:11 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 23.2880828336 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:16:31 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 57.5468576541 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:16:54 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 100.331790488 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:17:52 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 216.877854372 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 20:19:32 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4257.187s

FAILED (SKIP=1, errors=52)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 26s
115 actionable tasks: 89 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/mbibmnnhcryf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1073

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1073/display/redirect?page=changes>

Changes:

[kirillkozlov] Fix MongoDb SQL Integration Tests

[kirillkozlov] Add MongoDbIT back to build file

[kirillkozlov] Update JavaDoc comment and remove pipeline options


------------------------------------------
[...truncated 3.06 MB...]
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 85.4753464716 seconds before retrying copy because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:01:17 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc
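
The copy being retried is a server-side GCS rewrite (the objects.Rewrite
frames above). Roughly the shape of gcsio.copy, using Beam's vendored
storage client; treat this as an illustration of the call that keeps
failing on token refresh, not exact Beam code:

    from apache_beam.io.gcp.internal.clients import storage

    def copy(client, src_bucket, src_path, dest_bucket, dest_path):
        request = storage.StorageObjectsRewriteRequest(
            sourceBucket=src_bucket, sourceObject=src_path,
            destinationBucket=dest_bucket, destinationObject=dest_path)
        response = client.objects.Rewrite(request)
        # Large objects can require several Rewrite calls, resumed via
        # response.rewriteToken until response.done is True.
        return response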

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 198.483347531 seconds before retrying copy because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:02:42 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-e12d3773-b12e-4418-ae59-183303b5895b/'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-e12d3773-b12e-4418-ae59-183303b5895b/*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/gcs\\_it\\-e12d3773\\-b12e\\-4418\\-ae59\\-183303b5895b\\/[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 4.54718675221 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:06:01 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 6.19155288267 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:06:06 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py",> line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py",> line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 12.2239764145 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:06:12 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 31.4652996565 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:06:24 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 78.4589324961 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:06:55 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 134.730306993 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:08:14 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc
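
Every attempt fails the same way: the GCE metadata server answers 404 for the service account's token endpoint, so oauth2client can never mint an access token and the retries cannot succeed. The failing request can be reproduced from a VM using the URL and Metadata-Flavor header shown in the log; a sketch, using the 'default' service-account alias instead of the numbered account above:

    import urllib.error
    import urllib.request

    URL = ('http://metadata.google.internal/computeMetadata/v1/instance/'
           'service-accounts/default/token')
    req = urllib.request.Request(URL, headers={'Metadata-Flavor': 'Google'})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            print(resp.status, resp.read()[:80])
    except urllib.error.HTTPError as err:
        # A 404 here matches the failure above: the VM has no such
        # service account attached, or it cannot serve a token for it.
        print(err.code, err.reason)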

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 213.509157403 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 19:10:29 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4733.442s

FAILED (SKIP=1, errors=54)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 10s
118 actionable tasks: 92 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/wpvfssqloxpqi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python2 #1072

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1072/display/redirect>

Changes:


------------------------------------------
[...truncated 2.63 MB...]
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 86.6510451424 seconds before retrying copy because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:07:04 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 223.010957309 seconds before retrying copy because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:08:31 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 254, in copy
    response = self.client.objects.Rewrite(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1234, in Rewrite
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc
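
The 'Attempting refresh to obtain initial access_token' lines come from oauth2client, whose GCE credentials pull their token from the same metadata endpoint that is returning 404. Forcing that refresh directly should surface the same HttpAccessTokenRefreshError; a sketch, assuming the legacy oauth2client and httplib2 packages seen in the tracebacks are installed:

    import httplib2
    from oauth2client.contrib.gce import AppAssertionCredentials

    creds = AppAssertionCredentials()
    try:
        creds.refresh(httplib2.Http())  # hits the metadata token endpoint
        print('token acquired:', bool(creds.access_token))
    except Exception as exc:
        # On these Jenkins workers this raises HttpAccessTokenRefreshError.
        print(type(exc).__name__, exc)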

oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-ad2b9dd9-2eae-47ec-8842-eab6a3ac2cd5/'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-ad2b9dd9-2eae-47ec-8842-eab6a3ac2cd5/*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/gcs\\_it\\-ad2b9dd9\\-2eae\\-47ec\\-8842\\-eab6a3ac2cd5\\/[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 4.03138313409 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:12:14 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 8.71696059988 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:12:18 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 11.5246082739 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:12:27 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 32.6138174288 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:12:38 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 42.2709383293 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:13:11 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 114.154292913 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:13:53 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 291.960016092 seconds before retrying list_prefix because we caught exception: HttpAccessTokenRefreshError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token from the Google Compute Engine metadata service. Response:
{'status': '404', 'content-length': '129', 'x-xss-protection': '0', 'metadata-flavor': 'Google', 'server': 'Metadata Server for VM', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Mon, 25 Nov 2019 13:15:48 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>", line 466, in list_prefix
    response = self.client.objects.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1182, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 356, in MakeRequest
    max_retry_wait, total_wait_sec))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/gcsio_overrides.py>", line 45, in retry_func
    return http_wrapper.HandleExceptionsAndRebuildHttpConnections(retry_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py>", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc

apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4719.163s

FAILED (SKIP=1, errors=54)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 19s
115 actionable tasks: 89 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/vmm3grma224a4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org