Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/08 00:21:37 UTC

beam_PostCommit_PortableJar_Flink - Build # 1060 - Aborted

The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1060)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1060/ to view the results.
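
If the Jenkins UI is unavailable, the raw console log can usually be fetched directly over HTTP. A minimal Python sketch, assuming the standard Jenkins /consoleText endpoint is exposed for this job (the build number is taken from the notification above):

    import urllib.request

    # Assumption: Jenkins' /consoleText endpoint is enabled for this job.
    url = ("https://builds.apache.org/job/"
           "beam_PostCommit_PortableJar_Flink/1060/consoleText")
    with urllib.request.urlopen(url) as resp:
        # Print the raw console output of build #1060.
        print(resp.read().decode("utf-8", errors="replace"))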

Jenkins build is back to normal : beam_PostCommit_PortableJar_Flink #1075

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1075/display/redirect?page=changes>


beam_PostCommit_PortableJar_Flink - Build # 1074 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1074)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1074/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1073 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1073)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1073/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1072 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1072)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1072/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1071 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1071)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1071/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1070 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1070)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1070/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1069 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1069)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1069/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1068 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1068)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1068/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1067 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1067)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1067/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1066 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1066)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1066/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1065 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1065)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1065/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1064 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1064)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1064/ to view the results.

beam_PostCommit_PortableJar_Flink - Build # 1063 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1063)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1063/ to view the results.

Build failed in Jenkins: beam_PostCommit_PortableJar_Flink #1062

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1062/display/redirect?page=changes>

Changes:

[36090911+boyuanzz] [BEAM-8932] [BEAM-9036] Revert reverted commit to use PubsubMessage as


------------------------------------------
[...truncated 274.30 KB...]
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:291)
	at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	at java.lang.Thread.run(Thread.java:748)
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (4e11e2ab9eac26bc005a1381c517f931) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (4e11e2ab9eac26bc005a1381c517f931).
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (6abcb925b07544c5fd0b0cfa30762a84) [FINISHED]
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (4e11e2ab9eac26bc005a1381c517f931) [FINISHED]
[CHAIN MapPartition (MapPartition at [1]Create/FlatMap(<lambda at core.py:2593>)) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at [1]Create/FlatMap(<lambda at core.py:2593>)) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (16fbc30fe0d64e7ec453aa168b68d026) [FINISHED]
[Partition (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task Partition (1/1) (b9ffbc28366d940f0106fcae02ebd5a4) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) 6abcb925b07544c5fd0b0cfa30762a84.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) 4e11e2ab9eac26bc005a1381c517f931.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (6abcb925b07544c5fd0b0cfa30762a84) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at [1]Create/FlatMap(<lambda at core.py:2593>)) -> FlatMap (FlatMap at ExtractOutput[0]) 16fbc30fe0d64e7ec453aa168b68d026.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task Partition b9ffbc28366d940f0106fcae02ebd5a4.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:20>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (4e11e2ab9eac26bc005a1381c517f931) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at [1]Create/FlatMap(<lambda at core.py:2593>)) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1) (16fbc30fe0d64e7ec453aa168b68d026) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Partition (1/1) (b9ffbc28366d940f0106fcae02ebd5a4) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (attempt #0) to cecca9c9-82ad-4b48-96d7-88db0d58283e @ localhost (dataPort=-1)
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1) (b7048051bd97302e0eaa37e82137e201) switched from RUNNING to FINISHED.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1) (b7048051bd97302e0eaa37e82137e201).
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1) (b7048051bd97302e0eaa37e82137e201) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) [DEPLOYING]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) b7048051bd97302e0eaa37e82137e201.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) [DEPLOYING].
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/1) (b7048051bd97302e0eaa37e82137e201) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) [DEPLOYING].
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (attempt #0) to cecca9c9-82ad-4b48-96d7-88db0d58283e @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) c53564c25de51af1bc4a2c48d752bcce.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/1) (c53564c25de51af1bc4a2c48d752bcce) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image apachebeam/python3.7_sdk:2.19.0.dev, cause: Received exit code 1 for command 'docker pull apachebeam/python3.7_sdk:2.19.0.dev'. stderr: Error response from daemon: manifest for apachebeam/python3.7_sdk:2.19.0.dev not found
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - GetManifest for BEAM-PIPELINE/BeamApp-jenkins-0108030651-91e54179/artifact-manifest.json
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - Manifest at BEAM-PIPELINE/BeamApp-jenkins-0108030651-91e54179/artifact-manifest.json has 1 artifact locations
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - GetManifest for BEAM-PIPELINE/BeamApp-jenkins-0108030651-91e54179/artifact-manifest.json -> 1 artifacts
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory - Still waiting for startup of environment apachebeam/python3.7_sdk:2.19.0.dev for worker id 2-1
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory - Still waiting for startup of environment apachebeam/python3.7_sdk:2.19.0.dev for worker id 2-1
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory - Still waiting for startup of environment apachebeam/python3.7_sdk:2.19.0.dev for worker id 2-1
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory - Still waiting for startup of environment apachebeam/python3.7_sdk:2.19.0.dev for worker id 2-1
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:107 - Logging handler created.
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:133 - semi_persistent_directory: /tmp
[grpc-default-executor-0] WARN /usr/local/lib/python3.7/site-packages/apache_beam/options/pipeline_options.py:286 - Discarding unparseable args: ['--app_name=None', '--direct_runner_use_stacked_bundle', '--fail_on_checkpointing_errors', '--job_server_timeout=60', '--options_id=1', '--parallelism=1', '--pipeline_type_check', '--retrieval_service_type=CLASSLOADER'] 
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:145 - Python sdk harness started with pipeline_options: {'runner': 'None', 'job_name': 'BeamApp-jenkins-0108030651-91e54179', 'experiments': ['beam_fn_api'], 'save_main_session': True, 'sdk_location': 'container', 'environment_type': 'DOCKER', 'environment_config': 'apachebeam/python3.7_sdk:2.19.0.dev', 'sdk_worker_parallelism': '1', 'environment_cache_millis': '0', 'output_executable_path': 'flink-test-20200108-030646.jar', 'job_port': '0', 'artifact_port': '0', 'expansion_port': '0', 'flink_job_server_jar': 'https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/ws/src/runners/flink/1.9/job-server/build/libs/beam-runners-flink-1.9-job-server-2.19.0-SNAPSHOT.jar'}
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/statecache.py:137 - Creating state cache with size 0
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:87 - Creating insecure control channel for localhost:46851.
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:95 - Control channel established.
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:116 - Initializing SDKHarness with unbounded number of workers.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 2-1
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:532 - Creating insecure state channel for localhost:34665.
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:539 - State channel established.
[grpc-default-executor-0] INFO /usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:524 - Creating client data channel for localhost:46183
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Closing environment urn: "beam:env:docker:v1"
payload: "\n#apachebeam/python3.7_sdk:2.19.0.dev"

[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - 1 Beam Fn Logging clients still connected during shutdown.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment - Closing Docker container 00e6db7fb6bdfcbeb2797680fadaa3505b6e62376adf07be27734349d7735362. Logs:
2020/01/08 03:08:11 Initializing python harness: /opt/apache/beam/boot --id=2-1 --logging_endpoint=localhost:34395 --artifact_endpoint=localhost:46395 --provision_endpoint=localhost:40769 --control_endpoint=localhost:46851
2020/01/08 03:08:12 Installing setup packages ...
2020/01/08 03:08:12 Found artifact: pickled_main_session
2020/01/08 03:08:12 Executing: python -m apache_beam.runners.worker.sdk_worker_main
2020/01/08 03:08:32 Python exited: <nil>
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] WARN org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Error cleaning up servers urn: "beam:env:docker:v1"
payload: "\n#apachebeam/python3.7_sdk:2.19.0.dev"

java.io.IOException: Received exit code 1 for command 'docker kill 00e6db7fb6bdfcbeb2797680fadaa3505b6e62376adf07be27734349d7735362'. stderr: Error response from daemon: Cannot kill container: 00e6db7fb6bdfcbeb2797680fadaa3505b6e62376adf07be27734349d7735362: No such container: 00e6db7fb6bdfcbeb2797680fadaa3505b6e62376adf07be27734349d7735362
	at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
	at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
	at org.apache.beam.runners.fnexecution.environment.DockerCommand.killContainer(DockerCommand.java:148)
	at org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment.close(DockerContainerEnvironment.java:93)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:481)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:481)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:496)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1800(DefaultJobBundleFactory.java:436)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:168)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:258)
	at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:291)
	at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	at java.lang.Thread.run(Thread.java:748)
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) [FINISHED]
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 5775301a5b674af5be48baa28ebde854.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (1/1) (attempt #0) to cecca9c9-82ad-4b48-96d7-88db0d58283e @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1) (5775301a5b674af5be48baa28ebde854) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (1/1).
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) [DEPLOYING]
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) [DEPLOYING].
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) [DEPLOYING].
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4).
[DataSink (DiscardingOutput) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) [FINISHED]
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) e83df8663c3f36ffbdb5d15d2185cbb4.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/1) (e83df8663c3f36ffbdb5d15d2185cbb4) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-jenkins-0108030651-91e54179 (324b1e1c73be58c0df718a8f14a97101) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 324b1e1c73be58c0df718a8f14a97101 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-jenkins-0108030651-91e54179(324b1e1c73be58c0df718a8f14a97101).
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=16278}, allocationId: ada4f093caf642fc6c7f34d5d7c8f6fe, jobId: 324b1e1c73be58c0df718a8f14a97101).
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection a284bc631a256d1d9525dc01144df9d8: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 324b1e1c73be58c0df718a8f14a97101 from job leader monitoring.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 9bdd4ddd0d09ba72651f106ec9984891@akka://flink/user/jobmanager_1 for job 324b1e1c73be58c0df718a8f14a97101 from the resource manager.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 324b1e1c73be58c0df718a8f14a97101.
[main] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[main] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 324b1e1c73be58c0df718a8f14a97101.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 324b1e1c73be58c0df718a8f14a97101 because it is not registered.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection a284bc631a256d1d9525dc01144df9d8.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection cecca9c9-82ad-4b48-96d7-88db0d58283e because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-aa80af03-d3e0-41e6-ae3e-077c03dda8f1
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-012117dd-1170-4da9-a24b-38571e59226e
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-69c41440-830f-4792-9e73-0e2bd42a3d07
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-2] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-16] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:45741
[flink-akka.actor.default-dispatcher-18] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[main] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 67558 msecs
[main] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[main] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_36}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2593>)_21}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_28}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_34}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_22:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_36}: 26, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2593>)_4}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_24}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2593>)_4}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2593>)_4}: 98, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_25}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_25}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_36}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_28}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_27}: 33, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:0:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2593>)_4}: 98, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_28}: 0, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_23}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_23}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 1, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_34}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_22:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_35}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_25}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_24}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_24}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_24}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_27}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_27}: 33, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_25}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_23}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_23}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_35}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2593>)_21}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:0:0}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_35}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_35}: 0, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_34}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_28}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_22:0}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_22:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at <string>:20>)_17}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at <string>:20>)_17}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_19:1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_25}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19:1}: 3, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_27}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at <string>:20>)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2593>)_21}: 0, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_36}: 26, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_19:0}: 1, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2593>)_21}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_29}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at <string>:20>)_17}: 0)Distributions(48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=66, count=3, min=22, max=22}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19:1}: DistributionResult{sum=60, count=3, min=20, max=20}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=48, count=3, min=16, max=16}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=45, count=3, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=46, count=1, min=46, max=46}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=13, count=1, min=13, max=13}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=29, count=1, min=29, max=29}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=21, count=1, min=21, max=21}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=42, count=3, min=14, max=14}, 14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19:0}: DistributionResult{sum=19, count=1, min=19, max=19}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24}: DistributionResult{sum=21, count=1, min=21, max=21}, 
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=15, count=1, min=15, max=15}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_25}: DistributionResult{sum=14, count=1, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=42, count=3, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=45, count=3, min=15, max=15}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=42, count=3, min=14, max=14}))
[main] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Job BeamApp-jenkins-0108030651-91e54179_9fd07523-ccfe-4705-a965-2ae7d7ff332a finished successfully.

rm -rf $ENV_DIR
rm -f $OUTPUT_JAR

>>> SUCCESS
if [[ "$TEST_EXIT_CODE" -eq 0 ]]; then
  echo ">>> SUCCESS"
else
  echo ">>> FAILURE"
fi
exit $TEST_EXIT_CODE
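
For reference, the TEST_EXIT_CODE checked above is normally captured right after the test command itself runs, before the cleanup lines shown; a minimal sketch of that earlier step, in which only the variable name TEST_EXIT_CODE is taken from the log and the test invocation is hypothetical:

    # hypothetical: run the portable-jar pipeline test and remember its exit status
    python -m hypothetical_portable_jar_test
    TEST_EXIT_CODE=$?

Capturing the status before the cleanup lets the script delete its temporary environment and jar unconditionally while still propagating the test's result through the final exit.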

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/ws/src/runners/flink/job-server/flink_job_server.gradle>' line: 231

* What went wrong:
Execution failed for task ':runners:flink:1.9:job-server:testFlinkUberJarPy37'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
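
To dig into this failure locally, the failing task can be re-run with the extra logging Gradle suggests; a minimal sketch, assuming a checked-out Beam repository with its Gradle wrapper (the Py36 task reported below is run the same way):

    # re-run the failing uber-jar test with a stack trace and verbose logging
    ./gradlew :runners:flink:1.9:job-server:testFlinkUberJarPy37 --stacktrace --info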
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/ws/src/runners/flink/job-server/flink_job_server.gradle>' line: 231

* What went wrong:
Execution failed for task ':runners:flink:1.9:job-server:testFlinkUberJarPy36'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
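
On a local run, the individual deprecation warnings summarized here can be listed with the flag the message names; a sketch under the same assumption of a Beam checkout:

    # hypothetical local run: show each deprecation warning instead of the summary
    ./gradlew :runners:flink:1.9:job-server:testFlinkUberJarPy37 --warning-mode all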

BUILD FAILED in 18m 1s
81 actionable tasks: 74 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bubzqhoyv4bxw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


beam_PostCommit_PortableJar_Flink - Build # 1061 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink (build #1061)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1061/ to view the results.