Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/16 18:23:21 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Flink #385

See <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/385/display/redirect>

------------------------------------------
[...truncated 251.49 KB...]
See 'docker run --help'.
ERROR
test_write (apache_beam.io.external.xlang_parquetio_test.XlangParquetIOTest) ... Unable to find image 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR
test_java_expansion_dataflow (apache_beam.transforms.external_test.ExternalTransformTest) ... SKIP: No expansion service jar or port provided.
test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest) ... Unable to find image 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR
test_multi (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_nested (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_payload (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_pipeline_generation (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_simple (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
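
All three ERRORs below have the same root cause: docker run cannot resolve the
jenkins-docker-apache.bintray.io/beam/flink-job-server:latest image (the registry
answers "Repo 'apache' was not found"), so the command exits with status 125
before the job service ever starts. A hypothetical pre-flight check, not part of
the Beam test suite, that would surface this earlier using only stock docker CLI
subcommands:

    # Hypothetical helper (illustrative only): verify that the flink-job-server
    # image can be resolved before the tests try to run it.
    import subprocess

    IMAGE = 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest'

    def image_available(image=IMAGE):
        # `docker image inspect` exits 0 only if the image is already present locally.
        if subprocess.call(['docker', 'image', 'inspect', image]) == 0:
            return True
        # Otherwise try to pull it; a registry error such as "Repo 'apache' was
        # not found" shows up here instead of mid-test.
        return subprocess.call(['docker', 'pull', image]) == 0

    if __name__ == '__main__':
        print('flink-job-server image available: %s' % image_available())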

======================================================================
ERROR: test_generate_sequence (apache_beam.io.external.generate_sequence_test.XlangGenerateSequenceTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/io/external/generate_sequence_test.py",> line 59, in test_generate_sequence
    raise e
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: WARNING: Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
root: WARNING: python-snappy is not installed; some tests will be skipped.
root: WARNING: Tensorflow is not installed, so skipping some tests.
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7efc1a41f0c8> ====================
root: DEBUG: 23 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['external_1root/Read(BoundedCountingSource)/Impulse\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_19\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_20\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_21\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7efc1a41f140> ====================
root: DEBUG: 23 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['external_1root/Read(BoundedCountingSource)/Impulse\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_19\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_20\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_21\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '34187', '--artifact-port', '49969', '--expansion-port', '40129']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:34187.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:34187.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:34187.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '34187', '--artifact-port', '49969', '--expansion-port', '40129']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------
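
Exit status 125 in the traceback above is docker's own error code: the docker
CLI/daemon failed (here, because the image could not be pulled) before the
container command ever ran. The job_server.py frame at line 124 is where the SDK
polls the docker subprocess and gives up. A minimal sketch of that
launch-and-poll pattern, with illustrative names that are not Beam's actual
implementation:

    import subprocess
    import time

    def start_job_service(docker_cmd, timeout=60):
        # docker_cmd is the ['docker', 'run', ...] command shown in the DEBUG line above.
        process = subprocess.Popen(docker_cmd)
        deadline = time.time() + timeout
        while time.time() < deadline:
            returncode = process.poll()
            if returncode is not None:
                # docker run exited before the job service came up; 125 means the
                # docker CLI/daemon itself failed, e.g. the image could not be pulled.
                raise RuntimeError(
                    'Job service failed to start up with error %s' % returncode)
            time.sleep(1)  # the real code probes the gRPC channel here instead of sleeping
        return process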

======================================================================
ERROR: test_write (apache_beam.io.external.xlang_parquetio_test.XlangParquetIOTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/io/external/xlang_parquetio_test.py",> line 69, in test_write
    raise e
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7efc1a41f0c8> ====================
root: DEBUG: 40 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Add void 
key/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random 
key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7efc1a41f140> ====================
root: DEBUG: 40 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Add void 
key/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random 
key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '40793', '--artifact-port', '40915', '--expansion-port', '55221']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:40793.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:40793.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '40793', '--artifact-port', '40915', '--expansion-port', '55221']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 314, in test_java_expansion_portable_runner
    ExternalTransformTest.run_pipeline_with_portable_runner(None)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 347, in run_pipeline_with_portable_runner
    pipeline_options, ExternalTransformTest.expansion_service_port, True)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 375, in run_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 243, in run_pipeline
    job_service = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 161, in create_job_service
    return server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 84, in start
    self._endpoint = self._job_server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7efc1a41f0c8> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3root/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4root/Init/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at 
external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7efc1a41f140> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3root/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4root/Init/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at 
external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '39167', '--artifact-port', '43883', '--expansion-port', '48267']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:39167.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:39167.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '39167', '--artifact-port', '43883', '--expansion-port', '48267']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------
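
For reference on the error above: the job service is launched with the 'docker run ... jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' command shown in the captured logging, and the harness treats a non-zero exit from that process as a startup failure. The sketch below is not part of the build output; it only illustrates, using the image name and ports from this run, how a subprocess poll() of a docker run whose image cannot be resolved surfaces the exit status 125 reported here (125 is what docker run returns when the daemon itself rejects the run). The 10-second wait is an assumption for the sketch; the real harness polls the gRPC endpoint instead.

import subprocess
import time

# Mirrors the command from the captured logging above (ports are the ones
# picked for this particular run).
cmd = [
    'docker', 'run',
    '-v', '/usr/bin/docker:/bin/docker',
    '-v', '/var/run/docker.sock:/var/run/docker.sock',
    '--network=host',
    'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
    '--job-host', 'localhost',
    '--job-port', '39167',
    '--artifact-port', '43883',
    '--expansion-port', '48267',
]

process = subprocess.Popen(cmd)
time.sleep(10)  # illustrative wait, not the harness's actual readiness check

returncode = process.poll()
if returncode is not None and returncode != 0:
    # docker exits with 125 when the daemon reports an error, e.g. the
    # "Repo 'apache' was not found" failure above, which is what produces
    # "Job service failed to start up with error 125".
    raise RuntimeError(
        'Job service failed to start up with error %s' % returncode)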

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 8.124s

FAILED (SKIP=1, errors=3)

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython
setup.py:185: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
test_generate_sequence (apache_beam.io.external.generate_sequence_test.XlangGenerateSequenceTest) ... ok
test_write (apache_beam.io.external.xlang_parquetio_test.XlangParquetIOTest) ... ok
test_java_expansion_dataflow (apache_beam.transforms.external_test.ExternalTransformTest) ... SKIP: No expansion service jar or port provided.
test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest) ... Unable to find image 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR
test_multi (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_nested (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_payload (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_pipeline_generation (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_simple (apache_beam.transforms.external_test.ExternalTransformTest) ... ok

======================================================================
ERROR: test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 314, in test_java_expansion_portable_runner
    ExternalTransformTest.run_pipeline_with_portable_runner(None)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 347, in run_pipeline_with_portable_runner
    pipeline_options, ExternalTransformTest.expansion_service_port, True)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 375, in run_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 243, in run_pipeline
    job_service = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 161, in create_job_service
    return server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 84, in start
    self._endpoint = self._job_server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7f0ef216ce60> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/Filter(<lambda at expansion_service_test.py:62>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/PerElement:PairWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs\n  
ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7f0ef216ced8> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/Filter(<lambda at expansion_service_test.py:62>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/PerElement:PairWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs\n  
ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '51729', '--artifact-port', '44987', '--expansion-port', '33953']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:51729.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:51729.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '51729', '--artifact-port', '44987', '--expansion-port', '33953']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------
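
The "Waiting for jobs grpc channel to be ready at localhost:51729" lines above correspond to the harness polling the job service endpoint before submitting the pipeline. A minimal sketch of that kind of readiness check, assuming the endpoint from this run and an illustrative 30-second timeout (not Beam's actual retry logic):

import logging

import grpc  # grpcio


def wait_for_job_service(endpoint='localhost:51729', timeout_secs=30):
    """Block until a gRPC channel to the job service is ready, or fail."""
    logging.debug('Waiting for jobs grpc channel to be ready at %s.', endpoint)
    channel = grpc.insecure_channel(endpoint)
    try:
        # channel_ready_future resolves once the channel reaches READY.
        grpc.channel_ready_future(channel).result(timeout=timeout_secs)
    except grpc.FutureTimeoutError:
        raise RuntimeError('Job service at %s never became ready.' % endpoint)
    return channel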

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 8.255s

FAILED (SKIP=1, errors=1)

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 55s
125 actionable tasks: 94 executed, 25 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/mdtgbm4mdv4eo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Flink #388

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/388/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_XVR_Flink #387

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/387/display/redirect>

------------------------------------------
[...truncated 4.36 MB...]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (14/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (14/16) (bfe0587fe6a9c92d986e17d3022690c3) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (14/16).
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) [DEPLOYING]
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) [DEPLOYING].
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) [DEPLOYING].
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9).
[DataSink (DiscardingOutput) (14/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) cbbaad2e7ed6b7c55bf262f74bf35cb9.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (14/16) (cbbaad2e7ed6b7c55bf262f74bf35cb9) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16) (091abec29426ea5580631f3e6aee8bd0) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16) (091abec29426ea5580631f3e6aee8bd0).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16) (091abec29426ea5580631f3e6aee8bd0) [FINISHED]
[jobmanager-future-thread-11] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 091abec29426ea5580631f3e6aee8bd0.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (13/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (13/16).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (13/16) (091abec29426ea5580631f3e6aee8bd0) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) [DEPLOYING]
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) [DEPLOYING].
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) [DEPLOYING].
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd).
[DataSink (DiscardingOutput) (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 5bc382cf38c2652cb79c4c1b59d3e0bd.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (13/16) (5bc382cf38c2652cb79c4c1b59d3e0bd) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16) (5fdef0daebbf45c9e2485a8e0f61b949) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16) (5fdef0daebbf45c9e2485a8e0f61b949).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16) (5fdef0daebbf45c9e2485a8e0f61b949) [FINISHED]
[jobmanager-future-thread-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 5fdef0daebbf45c9e2485a8e0f61b949.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (16/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16) (5fdef0daebbf45c9e2485a8e0f61b949) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (16/16).
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) [DEPLOYING]
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) [DEPLOYING].
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) [DEPLOYING].
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895).
[DataSink (DiscardingOutput) (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 0f78e84b66de40b3a18111896e2ac895.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (16/16) (0f78e84b66de40b3a18111896e2ac895) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16) (31e71a3bb8c37bb66bc617f1dd31fe36) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16) (31e71a3bb8c37bb66bc617f1dd31fe36).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16) (31e71a3bb8c37bb66bc617f1dd31fe36) [FINISHED]
[jobmanager-future-thread-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 31e71a3bb8c37bb66bc617f1dd31fe36.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (1/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/16) (31e71a3bb8c37bb66bc617f1dd31fe36) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (1/16).
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) [DEPLOYING]
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) [DEPLOYING].
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) [DEPLOYING].
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347).
[DataSink (DiscardingOutput) (1/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 0d50dd24f5e962504dbdf057d1a38347.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/16) (0d50dd24f5e962504dbdf057d1a38347) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16) (3c126e206f480002e52d3da037752f52) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16) (3c126e206f480002e52d3da037752f52).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16) (3c126e206f480002e52d3da037752f52) [FINISHED]
[jobmanager-future-thread-12] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 3c126e206f480002e52d3da037752f52.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (9/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (9/16) (3c126e206f480002e52d3da037752f52) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (9/16).
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) [DEPLOYING]
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) [DEPLOYING].
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) [DEPLOYING].
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4).
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) fc0d08a14b3865a331824f88a8a555f4.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (fc0d08a14b3865a331824f88a8a555f4) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16) (ff5c57b8c0132f02b36150c4f32a227c) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16) (ff5c57b8c0132f02b36150c4f32a227c).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16) (ff5c57b8c0132f02b36150c4f32a227c) [FINISHED]
[jobmanager-future-thread-13] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) ff5c57b8c0132f02b36150c4f32a227c.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (12/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (12/16) (ff5c57b8c0132f02b36150c4f32a227c) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (12/16).
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) [DEPLOYING]
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) [DEPLOYING].
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) [DEPLOYING].
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293).
[DataSink (DiscardingOutput) (12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 10db2be33db7ee289cd282d3157e8293.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (12/16) (10db2be33db7ee289cd282d3157e8293) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16) (7f55dbaf7501f82230ea647639ed7cb9) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16) (7f55dbaf7501f82230ea647639ed7cb9).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16) (7f55dbaf7501f82230ea647639ed7cb9) [FINISHED]
[jobmanager-future-thread-14] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 7f55dbaf7501f82230ea647639ed7cb9.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (3/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (3/16).
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (3/16) (7f55dbaf7501f82230ea647639ed7cb9) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) [DEPLOYING]
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) [DEPLOYING].
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) [DEPLOYING].
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2).
[DataSink (DiscardingOutput) (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 62d007bc1bb335f36124af19e29c6af2.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (3/16) (62d007bc1bb335f36124af19e29c6af2) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16) (9e6dc134b768cfda22e0bd8954d2e4ec) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16) (9e6dc134b768cfda22e0bd8954d2e4ec).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16) (388654f5ff8127fc5349be655c22951f) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16) (388654f5ff8127fc5349be655c22951f).
[jobmanager-future-thread-10] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16) (9e6dc134b768cfda22e0bd8954d2e4ec) [FINISHED]
[jobmanager-future-thread-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16) (388654f5ff8127fc5349be655c22951f) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 9e6dc134b768cfda22e0bd8954d2e4ec.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (5/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (11/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) 388654f5ff8127fc5349be655c22951f.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16) (9e6dc134b768cfda22e0bd8954d2e4ec) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (5/16).
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) [DEPLOYING]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (11/16) (388654f5ff8127fc5349be655c22951f) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) [DEPLOYING].
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (11/16).
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) [DEPLOYING]
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) [DEPLOYING].
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) [DEPLOYING].
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) [DEPLOYING].
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5).
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7).
[DataSink (DiscardingOutput) (11/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) [FINISHED]
[DataSink (DiscardingOutput) (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 2783cbd9e031c907d3b67517f80947b5.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) e1ce14abb7448f8b0c7e101790dfbbd7.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (11/16) (2783cbd9e031c907d3b67517f80947b5) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (5/16) (e1ce14abb7448f8b0c7e101790dfbbd7) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-0917002243-1df425c6 (8454a727f41c787c8451de450c6eeda4) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 8454a727f41c787c8451de450c6eeda4 reached globally terminal state FINISHED.
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-0917002243-1df425c6(8454a727f41c787c8451de450c6eeda4).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 136f162b718c640717e6677efd177c15: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher9f9b35fc-9025-4089-9f9b-ba74b6593798.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher9f9b35fc-9025-4089-9f9b-ba74b6593798.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 8454a727f41c787c8451de450c6eeda4 with leader id a3d3db27ba2ef8d7028b9e85837b4ff8 lost leadership.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager a3d3db27ba2ef8d7028b9e85837b4ff8@akka://flink/user/jobmanager_1 for job 8454a727f41c787c8451de450c6eeda4 from the resource manager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /tmp/flink-io-d6aacbfe-dd4b-4fa6-bd6f-4b95ee9103f9
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-9] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher9f9b35fc-9025-4089-9f9b-ba74b6593798.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:33905
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 37300 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4}: 81, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 257, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 3, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 44, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine}: 0, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 569, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 2, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 6, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 2, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 2, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 2, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=external_4_PCollection_PCollection_1}: 6, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 0, 
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 2, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 257, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4}: 81, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 190, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 569, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 44, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 183, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 7, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 2, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, pcollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0)Distributions(ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=external_4_PCollection_PCollection_1}: DistributionResult{sum=90, count=6, min=15, max=15}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=84, count=6, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, max=21}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=pcollection}: DistributionResult{sum=96, count=6, min=16, max=16}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=pcollection_1}: DistributionResult{sum=72, count=3, min=21, max=27}, pcollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=pcollection_2}: DistributionResult{sum=48, count=3, min=16, max=16}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}))
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Manifest at /tmp/beam-artifact-staging/job_0d9e3e33-783b-48dd-9308-65c96edfd288/MANIFEST has 0 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-artifact-staging/job_0d9e3e33-783b-48dd-9308-65c96edfd288/
ok
test_multi (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_nested (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_payload (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_pipeline_generation (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_simple (apache_beam.transforms.external_test.ExternalTransformTest) ... ok

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 86.741s

OK (SKIP=1)

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerCleanup

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
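
For local triage, the failing task named above can be rerun with the extra logging Gradle suggests. The sketch below is only that: it assumes a Beam source checkout with the Gradle wrapper in the repo root, and simply combines the task name and flags already printed in this log.

    # Rerun the failing cross-language task with more diagnostics (sketch,
    # run from the root of a Beam checkout).
    ./gradlew :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingJava \
        --info --stacktrace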

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 58s
125 actionable tasks: 94 executed, 25 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/iwbau5xrpylf6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_XVR_Flink #386

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/386/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8034] Upgrade Flink Runner to 1.8.2

------------------------------------------
[...truncated 251.58 KB...]
ERROR
test_write (apache_beam.io.external.xlang_parquetio_test.XlangParquetIOTest) ... Unable to find image 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR
test_java_expansion_dataflow (apache_beam.transforms.external_test.ExternalTransformTest) ... SKIP: No expansion service jar or port provided.
test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest) ... Unable to find image 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR
test_multi (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_nested (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_payload (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_pipeline_generation (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_simple (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
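
The repeated "Unable to find image" / "Repo 'apache' was not found" errors above appear to come from resolving the jenkins-docker-apache.bintray.io/beam/flink-job-server image rather than from the tests themselves. A minimal check on the worker, assuming the Docker CLI is on PATH and the registry is reachable from that host, might look like:

    # Is the job-server image cached locally, and does the registry resolve it?
    docker images jenkins-docker-apache.bintray.io/beam/flink-job-server
    docker pull jenkins-docker-apache.bintray.io/beam/flink-job-server:latest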

======================================================================
ERROR: test_generate_sequence (apache_beam.io.external.generate_sequence_test.XlangGenerateSequenceTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/io/external/generate_sequence_test.py",> line 59, in test_generate_sequence
    raise e
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: WARNING: Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
root: WARNING: python-snappy is not installed; some tests will be skipped.
root: WARNING: Tensorflow is not installed, so skipping some tests.
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7f98d471f0c8> ====================
root: DEBUG: 23 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['external_1root/Read(BoundedCountingSource)/Impulse\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_19\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_20\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_21\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7f98d471f140> ====================
root: DEBUG: 23 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['external_1root/Read(BoundedCountingSource)/Impulse\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(SplitBoundedSource)/ParMultiDo(SplitBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1root/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource)\n  GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(ReadFromBoundedSource)/ParMultiDo(ReadFromBoundedSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_19\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_20\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_21\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '56893', '--artifact-port', '40379', '--expansion-port', '40815']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:56893.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:56893.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:56893.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '56893', '--artifact-port', '40379', '--expansion-port', '40815']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------
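
The captured logging above shows exactly what the test tried to start: a docker run of the flink-job-server image, which exited with status 125 (the status the Docker CLI reports when the run fails before a container starts, here because the image could not be resolved). A manual replay on the worker, copying the command and ports straight from the log, would be one way to confirm; the port numbers are the ephemeral ones this particular run picked and only matter for a local rerun.

    # Replay the job-server startup exactly as the test attempted it.
    docker run -v /usr/bin/docker:/bin/docker \
        -v /var/run/docker.sock:/var/run/docker.sock --network=host \
        jenkins-docker-apache.bintray.io/beam/flink-job-server:latest \
        --job-host localhost --job-port 56893 \
        --artifact-port 40379 --expansion-port 40815
    echo $?   # expected to print 125 while the image cannot be pulled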

======================================================================
ERROR: test_write (apache_beam.io.external.xlang_parquetio_test.XlangParquetIOTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/io/external/xlang_parquetio_test.py",> line 69, in test_write
    raise e
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7f98d471f0c8> ====================
root: DEBUG: 40 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Add void 
key/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random 
key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7f98d471f140> ====================
root: DEBUG: 40 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Add void 
key/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Gather bundles/ParMultiDo(GatherBundlesPerWindow):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random 
key/ParMultiDo(AssignShard):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/FileIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_2root/Values/Values/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:parquet_write)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '60743', '--artifact-port', '55163', '--expansion-port', '55353']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:60743.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:60743.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:60743.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '60743', '--artifact-port', '55163', '--expansion-port', '55353']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 314, in test_java_expansion_portable_runner
    ExternalTransformTest.run_pipeline_with_portable_runner(None)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 347, in run_pipeline_with_portable_runner
    pipeline_options, ExternalTransformTest.expansion_service_port, True)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 375, in run_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 243, in run_pipeline
    job_service = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 161, in create_job_service
    return server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 84, in start
    self._endpoint = self._job_server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7f98d471f0c8> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3root/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4root/Init/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at 
external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7f98d471f140> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3root/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4root/Init/Map/ParMultiDo(Anonymous)\n  ExternalTransform(beam:transforms:xlang:count)/Init/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs\n  ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at 
external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '54749', '--artifact-port', '35013', '--expansion-port', '52427']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:54749.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:54749.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '54749', '--artifact-port', '35013', '--expansion-port', '52427']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------
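
The traceback for this test shows PortableRunner building its own dockerized Flink job server (portable_runner.create_job_service -> job_server.start) because no job endpoint was supplied; that is the step failing with exit code 125. A hedged workaround sketch follows: point the pipeline at an already-running job server instead. The pipeline options named below exist in the Beam Python SDK, but the endpoint address localhost:8099 and the LOOPBACK environment are assumptions for illustration, not what this test actually configures.

# Workaround sketch, not the test's actual code: run against an existing
# Flink job server so PortableRunner never invokes 'docker run' itself.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:8099',  # assumes a Flink job server is already listening here
    '--environment_type=LOOPBACK',    # assumption: avoids pulling SDK harness images
])

with beam.Pipeline(options=options) as p:
    _ = (p
         | beam.Create(['a', 'b', 'c'])
         | beam.Map(lambda x: x.upper()))
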

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 9.042s

FAILED (SKIP=1, errors=3)

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython
setup.py:185: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
test_generate_sequence (apache_beam.io.external.generate_sequence_test.XlangGenerateSequenceTest) ... ok
test_write (apache_beam.io.external.xlang_parquetio_test.XlangParquetIOTest) ... ok
test_java_expansion_dataflow (apache_beam.transforms.external_test.ExternalTransformTest) ... SKIP: No expansion service jar or port provided.
test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest) ... Unable to find image 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR
test_multi (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_nested (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_payload (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_pipeline_generation (apache_beam.transforms.external_test.ExternalTransformTest) ... ok
test_simple (apache_beam.transforms.external_test.ExternalTransformTest) ... ok

======================================================================
ERROR: test_java_expansion_portable_runner (apache_beam.transforms.external_test.ExternalTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 314, in test_java_expansion_portable_runner
    ExternalTransformTest.run_pipeline_with_portable_runner(None)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 347, in run_pipeline_with_portable_runner
    pipeline_options, ExternalTransformTest.expansion_service_port, True)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/external_test.py",> line 375, in run_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 243, in run_pipeline
    job_service = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 161, in create_job_service
    return server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 84, in start
    self._endpoint = self._job_server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: INFO: ==================== <function lift_combiners at 0x7ff7837ace60> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/Filter(<lambda at expansion_service_test.py:62>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/PerElement:PairWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs\n  
ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7ff7837aced8> ====================
root: DEBUG: 30 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4\n  Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(unicode)_17\n  Map(unicode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18\n  Map(<lambda at external_test.py:366>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_3_AppliedPTransform_root/Filter(<lambda at expansion_service_test.py:62>)_3\n  ExternalTransform(beam:transforms:xlang:filter_less_than_eq)/Filter(<lambda at expansion_service_test.py:62>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_4_AppliedPTransform_root/PerElement/PerElement:PairWithVoid_4\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/PerElement:PairWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge\n  ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs\n  
ExternalTransform(beam:transforms:xlang:count)/PerElement/CombinePerKey(CountCombineFn)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21\n  Map(<lambda at external_test.py:370>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22\n  Map(<lambda at external_test.py:371>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_25\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26\n  assert_that/Create/FlatMap(<lambda at core.py:2466>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_28\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_30\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_32\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_33\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_34\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_35\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_40\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_41\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '36473', '--artifact-port', '43537', '--expansion-port', '39503']
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:36473.
root: DEBUG: Waiting for jobs grpc channel to be ready at localhost:36473.
root: ERROR: Starting job service with ['docker', 'run', '-v', u'/usr/bin/docker:/bin/docker', '-v', '/var/run/docker.sock:/var/run/docker.sock', '--network=host', 'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest', '--job-host', 'localhost', '--job-port', '36473', '--artifact-port', '43537', '--expansion-port', '39503']
root: ERROR: Error bringing up job service
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
--------------------- >> end captured logging << ---------------------
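
For reference, exit code 125 from 'docker run' means docker itself failed to launch the container (for example, because the image tag logged above could not be resolved), rather than the Flink job server failing after startup. The following is a minimal diagnostic sketch, not Beam's own code, for reproducing just that startup step outside the test harness; it assumes docker is on PATH and simply reuses the command and ports captured in this run (any free ports would do):

    # Re-run the job-server container command from the log above and report
    # docker's exit code. A 125 here reproduces the failure seen in the test.
    import subprocess

    cmd = [
        'docker', 'run',
        '-v', '/usr/bin/docker:/bin/docker',
        '-v', '/var/run/docker.sock:/var/run/docker.sock',
        '--network=host',
        'jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
        '--job-host', 'localhost',
        '--job-port', '36473',
        '--artifact-port', '43537',
        '--expansion-port', '39503',
    ]
    proc = subprocess.Popen(cmd)
    print('docker run exited with %d' % proc.wait())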

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 6.995s

FAILED (SKIP=1, errors=1)

> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:flink:1.5:job-server:validatesCrossLanguageRunnerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
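
To chase either failure locally, the failing task can be re-run on its own with the extra logging Gradle suggests above. A minimal sketch, assuming a Beam checkout with the standard ./gradlew wrapper in the current directory; subprocess is used only to keep the example in Python, and running the same command directly in a shell is equivalent:

    # Re-run only the failing cross-language task with verbose output.
    # check_call raises CalledProcessError if the task fails again.
    import subprocess

    subprocess.check_call([
        './gradlew',
        ':runners:flink:1.5:job-server:validatesCrossLanguageRunnerPythonUsingPython',
        '--stacktrace', '--info',
    ])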

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 36s
125 actionable tasks: 94 executed, 25 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/oanutx53plwum

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org