Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/23 18:19:11 UTC

Build failed in Jenkins: beam_PostCommit_Py_ValCont #4882

See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/4882/display/redirect>

Changes:


------------------------------------------
[...truncated 594.49 KB...]
Downloading https://files.pythonhosted.org/packages/ad/e7/371d64fc6c6fce53c490fc38816a77fe8c75ff608edc0807170ff3c62bf2/timeloop-1.0.2-py2-none-any.whl#sha256=70cb69eeef39968ea0e6dd68a3a3a51257d5f4aa27b177ab1b19ca86ff525946
Best match: timeloop 1.0.2
Processing timeloop-1.0.2-py2-none-any.whl
Installing timeloop-1.0.2-py2-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/timeloop-1.0.2-py2.7.egg>
Searching for ipython<6,>=5.8.0
Reading https://pypi.org/simple/ipython/
Downloading https://files.pythonhosted.org/packages/b0/88/d996ab8be22cea1eaa18baee3678a11265e18cf09974728d683c51102148/ipython-5.8.0-py2-none-any.whl#sha256=37101b8cbe072fe17bff100bc03d096404e4a9a0357097aeb5b61677c042cab1
Best match: ipython 5.8.0
Processing ipython-5.8.0-py2-none-any.whl
Installing ipython-5.8.0-py2-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
writing requirements to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/ipython-5.8.0-py2.7.egg/EGG-INFO/requires.txt>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/ipython-5.8.0-py2.7.egg>
Searching for facets-overview<2,>=1.0.0
Reading https://pypi.org/simple/facets-overview/
Downloading https://files.pythonhosted.org/packages/df/8a/0042de5450dbd9e7e0773de93fe84c999b5b078b1f60b4c19ac76b5dd889/facets_overview-1.0.0-py2.py3-none-any.whl#sha256=bda7e7b68ff68f5757af87b9cf8b76994ab15db08db2155e15e4eae58695f4ae
Best match: facets-overview 1.0.0
Processing facets_overview-1.0.0-py2.py3-none-any.whl
Installing facets_overview-1.0.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
writing requirements to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/facets_overview-1.0.0-py2.7.egg/EGG-INFO/requires.txt>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/facets_overview-1.0.0-py2.7.egg>
Searching for traitlets>=4.2
Reading https://pypi.org/simple/traitlets/
Downloading https://files.pythonhosted.org/packages/ca/ab/872a23e29cec3cf2594af7e857f18b687ad21039c1f9b922fac5b9b142d5/traitlets-4.3.3-py2.py3-none-any.whl#sha256=70b4c6a1d9019d7b4f6846832288f86998aa3b9207c6821f3578a6a6a467fe44
Best match: traitlets 4.3.3
Processing traitlets-4.3.3-py2.py3-none-any.whl
Installing traitlets-4.3.3-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
writing requirements to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/traitlets-4.3.3-py2.7.egg/EGG-INFO/requires.txt>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/traitlets-4.3.3-py2.7.egg>
Searching for simplegeneric>0.8
Reading https://pypi.org/simple/simplegeneric/
Downloading https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip#sha256=dc972e06094b9af5b855b3df4a646395e43d1c9d0d39ed345b7393560d0b9173
Best match: simplegeneric 0.8.1
Processing simplegeneric-0.8.1.zip
Writing /tmp/easy_install-MKMS44/simplegeneric-0.8.1/setup.cfg
Running simplegeneric-0.8.1/setup.py -q bdist_egg --dist-dir /tmp/easy_install-MKMS44/simplegeneric-0.8.1/egg-dist-tmp-h8ePUx
zip_safe flag not set; analyzing archive contents...
Moving simplegeneric-0.8.1-py2.7.egg to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/simplegeneric-0.8.1-py2.7.egg>
Searching for pygments
Reading https://pypi.org/simple/pygments/
Downloading https://files.pythonhosted.org/packages/5c/73/1dfa428150e3ccb0fa3e68db406e5be48698f2a979ccbcec795f28f44048/Pygments-2.4.2-py2.py3-none-any.whl#sha256=71e430bc85c88a430f000ac1d9b331d2407f681d6f6aec95e8bcfbc3df5b0127
Best match: Pygments 2.4.2
Processing Pygments-2.4.2-py2.py3-none-any.whl
Installing Pygments-2.4.2-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/Pygments-2.4.2-py2.7.egg>
Searching for prompt-toolkit<2.0.0,>=1.0.4
Reading https://pypi.org/simple/prompt-toolkit/
Downloading https://files.pythonhosted.org/packages/9d/d2/2f099b5cd62dab819ce7a9f1431c09a9032fbfbb6474f442722e88935376/prompt_toolkit-1.0.18-py2-none-any.whl#sha256=f7eec66105baf40eda9ab026cd8b2e251337eea8d111196695d82e0c5f0af852
Best match: prompt-toolkit 1.0.18
Processing prompt_toolkit-1.0.18-py2-none-any.whl
Installing prompt_toolkit-1.0.18-py2-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
writing requirements to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/prompt_toolkit-1.0.18-py2.7.egg/EGG-INFO/requires.txt>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/prompt_toolkit-1.0.18-py2.7.egg>
Searching for pickleshare
Reading https://pypi.org/simple/pickleshare/
Downloading https://files.pythonhosted.org/packages/9a/41/220f49aaea88bc6fa6cba8d05ecf24676326156c23b991e80b3f2fc24c77/pickleshare-0.7.5-py2.py3-none-any.whl#sha256=9649af414d74d4df115d5d718f82acb59c9d418196b7b4290ed47a12ce62df56
Best match: pickleshare 0.7.5
Processing pickleshare-0.7.5-py2.py3-none-any.whl
Installing pickleshare-0.7.5-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
writing requirements to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/pickleshare-0.7.5-py2.7.egg/EGG-INFO/requires.txt>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/pickleshare-0.7.5-py2.7.egg>
Searching for pexpect
Reading https://pypi.org/simple/pexpect/
Downloading https://files.pythonhosted.org/packages/0e/3e/377007e3f36ec42f1b84ec322ee12141a9e10d808312e5738f52f80a232c/pexpect-4.7.0-py2.py3-none-any.whl#sha256=2094eefdfcf37a1fdbfb9aa090862c1a4878e5c7e0e7e7088bdb511c558e5cd1
Best match: pexpect 4.7.0
Processing pexpect-4.7.0-py2.py3-none-any.whl
Installing pexpect-4.7.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
writing requirements to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/pexpect-4.7.0-py2.7.egg/EGG-INFO/requires.txt>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/pexpect-4.7.0-py2.7.egg>
Searching for decorator
Reading https://pypi.org/simple/decorator/
Downloading https://files.pythonhosted.org/packages/8f/b7/f329cfdc75f3d28d12c65980e4469e2fa373f1953f5df6e370e84ea2e875/decorator-4.4.1-py2.py3-none-any.whl#sha256=5d19b92a3c8f7f101c8dd86afd86b0f061a8ce4540ab8cd401fa2542756bce6d
Best match: decorator 4.4.1
Processing decorator-4.4.1-py2.py3-none-any.whl
Installing decorator-4.4.1-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/decorator-4.4.1-py2.7.egg>
Searching for backports.shutil-get-terminal-size
Reading https://pypi.org/simple/backports.shutil-get-terminal-size/
Downloading https://files.pythonhosted.org/packages/7d/cd/1750d6c35fe86d35f8562091737907f234b78fdffab42b29c72b1dd861f4/backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl#sha256=0975ba55054c15e346944b38956a4c9cbee9009391e41b86c68990effb8c1f64
Best match: backports.shutil-get-terminal-size 1.0.0
Processing backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl
Installing backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/backports.shutil_get_terminal_size-1.0.0-py2.7.egg>
Searching for ipython-genutils
Reading https://pypi.org/simple/ipython-genutils/
Downloading https://files.pythonhosted.org/packages/fa/bc/9bd3b5c2b4774d5f33b2d544f1460be9df7df2fe42f352135381c347c69a/ipython_genutils-0.2.0-py2.py3-none-any.whl#sha256=72dd37233799e619666c9f639a9da83c34013a73e8bbc79a7a6348d93c61fab8
Best match: ipython-genutils 0.2.0
Processing ipython_genutils-0.2.0-py2.py3-none-any.whl
Installing ipython_genutils-0.2.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/ipython_genutils-0.2.0-py2.7.egg>
Searching for ptyprocess>=0.5
Reading https://pypi.org/simple/ptyprocess/
Downloading https://files.pythonhosted.org/packages/d1/29/605c2cc68a9992d18dada28206eeada56ea4bd07a239669da41674648b6f/ptyprocess-0.6.0-py2.py3-none-any.whl#sha256=d7cc528d76e76342423ca640335bd3633420dc1366f258cb31d05e865ef5ca1f
Best match: ptyprocess 0.6.0
Processing ptyprocess-0.6.0-py2.py3-none-any.whl
Installing ptyprocess-0.6.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>

Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/ptyprocess-0.6.0-py2.7.egg>
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.io.gcp.datastore.v1.datastoreio"
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
WARNING:apache_beam.runners.interactive.interactive_environment:Interactive Beam requires Python 3.5.3+.
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING:apache_beam.runners.interactive.interactive_environment:You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:517: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for spec in yaml.load_all(open(transcript_filename)):
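
The YAMLLoadWarning above points at the unsafe default loader used by yaml.load_all(). A minimal sketch of the fix the warning suggests, assuming transcript_filename is the same YAML transcript path used in trigger_test.py (the file path and run_transcript_spec helper below are illustrative placeholders, not taken from the build):

    import yaml

    transcript_filename = 'trigger_transcripts.yaml'  # placeholder path for illustration

    # Pass an explicit safe Loader (or use yaml.safe_load_all) instead of the
    # deprecated, unsafe default loader flagged by the warning above.
    with open(transcript_filename) as transcript:
        for spec in yaml.load_all(transcript, Loader=yaml.SafeLoader):
            run_transcript_spec(spec)  # hypothetical stand-in for the per-spec test logic
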
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
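
The repeated warning above notes that --region currently defaults to us-central1 and will become a required option. A minimal sketch of setting it explicitly through PipelineOptions, assuming a Dataflow run; the project, bucket, and other values below are illustrative placeholders, not taken from this build:

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Explicitly pass --region so the pipeline does not rely on the
    # us-central1 default described in the warning. All values are placeholders.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-gcp-project',
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',
    ])
    print(options.view_as(GoogleCloudOptions).region)  # 'us-central1'
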
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 56, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 44, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 180, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 66, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1442, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-340'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:330)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:93)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:220)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-340'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-23_10_13_53-7273715890606982172?project=apache-beam-testing

--------------------- >> end captured stdout << ----------------------
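
The KeyError in the SDK-harness traceback above comes from a plain dict lookup (self.fns[bundle_descriptor_id]) for a bundle descriptor id the worker never registered. A stripped-down, purely hypothetical sketch of that registry pattern, not Beam's actual worker code, showing how the lookup fails:

    # Hypothetical illustration only, not the real sdk_worker.py: a process-bundle
    # request references a descriptor id that was never registered, so the bare
    # dict access raises KeyError, mirroring the u'-340' failure above.
    class DescriptorRegistry(object):
        def __init__(self):
            self.fns = {}  # bundle_descriptor_id -> descriptor

        def register(self, descriptor_id, descriptor):
            self.fns[descriptor_id] = descriptor

        def get(self, descriptor_id):
            return self.fns[descriptor_id]  # raises KeyError for unknown ids

    registry = DescriptorRegistry()
    registry.register(u'-339', object())  # a different descriptor was registered
    registry.get(u'-340')                 # KeyError: u'-340'
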

----------------------------------------------------------------------
XML: nosetests-python2.7_sdk.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 656.186s

FAILED (errors=1)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191123-180018
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:43a6b572e2739754550bf73b113d157281b63e2b6bd8a03201f6e8abe7424c91
Deleted: sha256:84bc36f9d09ee1567cb910c697d30f49959cfe18f4afebbb8753ef43d9603281
Deleted: sha256:6df90f4079ab4feac421dd028955f958c0388d1e441c6f4fd804c4285f4bca84
Deleted: sha256:9c494262b66522aef834667f2965e22035bbd36371c62132012654f5da2e8252
Deleted: sha256:3fb169fdc392d18f28c88737689e84aeece40a8b5d59c2b77bae0f6296a8342a
Deleted: sha256:535ce23d5dc8524e7d58c6316d608cead6aa409fd7ffa7616ba097fac88073c1
Deleted: sha256:fec28987ef8cfeac5143063ce8939c65c9159612e0034bfad37b71ab4f2f2129
Deleted: sha256:29d32c668908bfff251ae004a3d7bf80acd9f5c93fdc8ff03c8356d736591768
Deleted: sha256:e84deb07f8a4ac4a216fbc750d3582b75c3adb750524f9821a4d0ad92776787f
Deleted: sha256:cb0297aaee0ead53cfd01cecadff4d6b1bce021595295297f700491dc30f3ff5
Deleted: sha256:060ec94d36b945d3e39ea63f86a6cb5dd82bb651b465e28fef9e64fae8e7bbc2
Deleted: sha256:4a5fd1790c5c24ea30fdecb04bf2399bdbe673ed2cd9f9cd74876d5fc335b23b
Deleted: sha256:52e7af9837ba9792a1c81f15412d81b6cad8ee3a54f57c1dbd7ae36973c68b6c
Deleted: sha256:dd1dbffa3f3f6ff9c6bab0f459817d8f6ae0603aa4ec42a854d396add84faebe
Deleted: sha256:b549da363836f35575c2f0b3a4bd9b6c1c3253f80d7c09f4f608bb4f7d285605
Deleted: sha256:c02f23bf9b9aa1cf6f2b3fdc8d57a9da56b7ff7ca6875f6301c8d1f8f39e2d7e
Deleted: sha256:d5040f5a044eef98d6406a85af956cce7ec0059e2b17662854a7804174f6dc7d
Deleted: sha256:dcf6756e2dd3ffeefc4790f6107e0e6c2596bf4d937f8193f3b8883fc3cddafd
Deleted: sha256:097a35cf5977b1926f9ed899722f7c32cde7fa21029df2d57ded4dc498fffd01
Deleted: sha256:4e7dff9343a711d35fbd1d58e0eb05b101340e7454262da7f8b72328494b0bf3
Deleted: sha256:cd89945ce9fb58a86c15c726cea38e5b6a1fd4da982c7a212c7e7503416d8419
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:43a6b572e2739754550bf73b113d157281b63e2b6bd8a03201f6e8abe7424c91
  Associated tags:
 - 20191123-180018
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191123-180018
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191123-180018].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:43a6b572e2739754550bf73b113d157281b63e2b6bd8a03201f6e8abe7424c91].
Removed the container
Build step 'Execute shell' marked build as failure



Jenkins build is back to normal : beam_PostCommit_Py_ValCont #4884

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/4884/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Py_ValCont #4883

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/4883/display/redirect>

Changes:


------------------------------------------
[...truncated 594.05 KB...]
setup.py:198: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.18.0.dev' to '2.18.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:58: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.io.gcp.datastore.v1.datastoreio"
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
WARNING:apache_beam.runners.interactive.interactive_environment:Interactive Beam requires Python 3.5.3+.
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING:apache_beam.runners.interactive.interactive_environment:You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:517: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for spec in yaml.load_all(open(transcript_filename)):
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 85, in _run_wordcount_it
    save_main_session=False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 117, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 66, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1442, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -707: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-681'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:330)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:93)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:220)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -707: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-681'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-23_16_08_33-10724413677870097587?project=apache-beam-testing

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 56, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 44, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 180, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 66, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1442, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -402: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-376'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:330)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:93)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:220)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -402: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-376'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-23_16_13_57-1302645629581801063?project=apache-beam-testing

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: nosetests-python2.7_sdk.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 629.340s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191124-000116
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:45ea26af0c0459c8a3e48d101ce22ba27888f9dee014f872777f23019bb94fb2
Deleted: sha256:1e291bdb07f7870a2d3aee948ff4ef1d01310734b71ef211fb4babb52529a5d4
Deleted: sha256:a819c82c4b7bfbe48f1a0f43debe544c28010fae1904aa3697181cb03d8b63fd
Deleted: sha256:35a952d1f667684d274ae0dd7b93a0d15877338c06cd2f3c844934ff7df331ef
Deleted: sha256:dad57d222715273e642ed6e02a579f989a6b67c81a5f09a6baca615c3aab9bf9
Deleted: sha256:abfdb3dbc3a54957287898346ab251883d6d9063fb375b811458a2703134e96c
Deleted: sha256:071147b5b75e5ddc7dd639530627a447667e7557155255980c4c779b58363af1
Deleted: sha256:3d47d05fbf2977716f3bfc6654ecea922035652446a1a5ce2196ae1359cf9d84
Deleted: sha256:8db8fd1da147bcb0df409fb1c13bd51ae8e9c1914b6522bc0b934018f527e507
Deleted: sha256:638d070c25d476f14156bb481b0eafb17da3e63b2d5579e395676a2f10a6b4d3
Deleted: sha256:028a619158061ff6dfb716adedc3306e86e25a6aa2ae3956419d74ab3abb17f3
Deleted: sha256:a2e176f99670492d6ee60ac3354b027cba266f39cc097420daea4cb750bc9063
Deleted: sha256:cd3053f3c18ea19ec0c8a8e85190c33a094004976e7e91fc4026c4cb9a70386d
Deleted: sha256:9f3f2ddf92a578a4f2905c52ab6d3397078a850e0e5b8ca24388ad98a68a53cc
Deleted: sha256:dfdb81887df5814815548fcd1311d47d1d0e02adf3438040dfd21a6ffaeeac0c
Deleted: sha256:13a7268d518d04a04a449293610efc0e151fd895bcc68ad09564834b46a3dc29
Deleted: sha256:fc2f40101412ebfb9ad258206a83f1fe35650fe1a64b0cd9188c7b3d11612056
Deleted: sha256:629570f8bc722a0c423e66b4557fe3036587751f660499075ad4b19ec207906f
Deleted: sha256:30e028baaa3fc5038758a0c53f578b000b6452596f9ce541aac2707bcdaaae4d
Deleted: sha256:5bdd745d59bdfb74c288fe72c16827d209bcd5f0da228dd336437d55ed653872
Deleted: sha256:dc8c3e3c0fabfc32925be9f29e81a708e0097076a380e748b5fc31c04ddc6ed5
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:45ea26af0c0459c8a3e48d101ce22ba27888f9dee014f872777f23019bb94fb2
  Associated tags:
 - 20191124-000116
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191124-000116
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191124-000116].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:45ea26af0c0459c8a3e48d101ce22ba27888f9dee014f872777f23019bb94fb2].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org