Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/28 23:07:59 UTC

Build failed in Jenkins: beam_PostCommit_Py_ValCont #3679

See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3679/display/redirect>

------------------------------------------
Started by GitHub push by iemejia
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont
FATAL: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:52486
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.Request.call(Request.java:202)
		at hudson.remoting.Channel.call(Channel.java:954)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
Caused: hudson.remoting.RequestAbortedException
	at hudson.remoting.Request.abort(Request.java:340)
	at hudson.remoting.Channel.terminate(Channel.java:1038)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1450)
	at hudson.remoting.Channel.close(Channel.java:1403)
	at jenkins.slaves.DefaultJnlpSlaveReceiver.afterChannel(DefaultJnlpSlaveReceiver.java:173)
	at org.jenkinsci.remoting.engine.JnlpConnectionState$4.invoke(JnlpConnectionState.java:421)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fire(JnlpConnectionState.java:312)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fireAfterChannel(JnlpConnectionState.java:418)
	at org.jenkinsci.remoting.engine.JnlpProtocol4Handler$Handler$1.run(JnlpProtocol4Handler.java:334)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR: apache-beam-jenkins-15 is offline; cannot locate JDK 1.8 (latest)

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_ValCont #3720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3720/display/redirect>



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3719/display/redirect>

------------------------------------------
[...truncated 208.25 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_08_43-3863655898976076169?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
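
The immediate failure in this test (and in test_metrics_fnapi_it below) is the KeyError raised inside the SDK harness: bundle_processor keeps a mapping of data-input operations keyed by ptransform id, indexes it with the id attached to each incoming data element, and the id it received (the bytes shown in the KeyError) has no entry in that mapping. A minimal, self-contained sketch of that failing lookup, with hypothetical ids; this is not Beam's actual bundle_processor code:

class FakeDataInputOperation(object):
    """Stand-in for the harness operation whose process_encoded consumes raw element bytes."""
    def process_encoded(self, encoded_data):
        pass

# Operations registered for the bundle, keyed by ptransform id (the ids here are made up).
ops_by_ptransform_id = {u'\n\x04-208\x12\x04-206': FakeDataInputOperation()}

# Id carried on the incoming data stream; it matches no registered key, so the
# lookup fails with the same shape of error as above: KeyError: u'\n\x04-186\x12\x04-184'
incoming_id = u'\n\x04-186\x12\x04-184'
ops_by_ptransform_id[incoming_id].process_encoded(b'raw element bytes')
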

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_16_11-10906469653674380520?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 815.020s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-180011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:7804478c37ec3682df3395abf43ca0fcdcbbab78f2feb615de1a20a3d667e019
Deleted: sha256:cb253f66a63544d9392d17636cd262a7fdccb4cb2eff58df2f7eb4017103ef7c
Deleted: sha256:4973a15bb50f72d3aec3df7fdcebdb7b3dd385e8672faa0ad29e8474f5936929
Deleted: sha256:c26a20aa35ab85d43a986ff9986ce40f8a29465cd312569a32cb3570960a60fb
Deleted: sha256:348e925365134bad5d6ad48558c070f12ce8c658e71dc226113f83793cff33a2
Deleted: sha256:350d221d21cc8e909aac90195353036d2395516f655df5f428eb959264d704c5
Deleted: sha256:40569dad9f1ab24247992ea25f4ff6782b7f141c844307fe0df75b5e8fdefea8
Deleted: sha256:29afd2c6efb5835ffd92206c06dcef3859814c4e2b79d9ed46796cbd610e2de1
Deleted: sha256:5fc3f1a3e6c0aee7e160b6523c6ce502e6c81f862bb7b0298efbc5082f3ee6b8
Deleted: sha256:291858aedc63bf7d47fe698693fa38c1251b55935b091443b817b2924e979ef6
Deleted: sha256:73457211c004f14c2f937b67cb9f547bc3b85d83acd3baa26d3b77799652a398
Deleted: sha256:7e13e36844c83629026b086fc959884110381367e3e974d72287a191a36b3664
Deleted: sha256:92850e9324724d46ce0663a81620112e66dd82804c908e148cf0420606f5514c
Deleted: sha256:2e9b34d761c0882b3a58ee9a8465616709e7996b78a723f3a58b31a874bf26cb
Deleted: sha256:841941b37ffc941b8eaf3f3ef75cad1cf3c430772d3ce6ce31a7ca119cced081
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:7804478c37ec3682df3395abf43ca0fcdcbbab78f2feb615de1a20a3d667e019
  Associated tags:
 - 20190704-180011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-180011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-180011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:7804478c37ec3682df3395abf43ca0fcdcbbab78f2feb615de1a20a3d667e019].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3718/display/redirect>

------------------------------------------
[...truncated 207.31 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_05_23_58-11063630514579930569?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_05_30_46-921014861769287707?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 744.166s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-121650
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8aec62bae3a1d6e6f0fb9fc06a413134ec6d508acc7719398a5c87b7927782f
Deleted: sha256:cab1138f70b3910e8f4bd1d151d0bc9c6117997c37be6e180af06e300f0cf2ee
Deleted: sha256:b8bb563229e2a24da8c04a096cb6635e0de7bc7404eb21be376b78fe65007a7f
Deleted: sha256:a291ecc029f4cecd6d448f33323e430ef2150e51d9dd38b1528dfdde55f95325
Deleted: sha256:0ad3f0a6bf131edac1cac34e9c61121e7981cb435113213bf1107b44d58cb4b7
Deleted: sha256:1d0d2b34f8533459faccadbf0eaf93c62b57812cf4943f815622e4f0c683bcd6
Deleted: sha256:c9617d316354c5027afb111779d4f601ae33f721c9b9f2b86aba4f0900353407
Deleted: sha256:8714433aed9ec9ea78833c99bd392f53db9c85185cb40a1b93afc7797fcca031
Deleted: sha256:a670dcab1dac24128fa469915f54162bb4c45742564298dcdf9fbb2a9044906d
Deleted: sha256:0ec949e0824a7ac58360027a394a430ba4e617f371fa24d448f525d6d0695d46
Deleted: sha256:64f9506411d2c8df71987cdccfd0d2fa2585fffa0e9c670016d71c2c2b7da987
Deleted: sha256:f9de546fbc2dffc65a7c49d88d6c34b48fb22bdc70b84f8e9d7fbc2013042ddd
Deleted: sha256:6fc62be8e6052b03560ac27f1ed61430ac99f68b4991a625c446b029ace72f19
Deleted: sha256:477ecc3178681f79ad9120e770189f8d321dcdf957c0fa2e3fb51e99c6f65064
Deleted: sha256:bf8aadb3bd94b5a5b26bbaed13b3e10bf34d61c88786576a3059b8914be73a42
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8aec62bae3a1d6e6f0fb9fc06a413134ec6d508acc7719398a5c87b7927782f
  Associated tags:
 - 20190704-121650
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-121650
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-121650].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8aec62bae3a1d6e6f0fb9fc06a413134ec6d508acc7719398a5c87b7927782f].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3717/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7550] Reimplement Python ParDo load test according to the proposal

------------------------------------------
[...truncated 207.63 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_04_54_22-9572408781194139436?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_05_00_20-11181944445268906857?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 774.258s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-114716
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:e1d8bce55776a09ce3de168e03530296b5f19cb6bba63751374c5f3adac71e58
Deleted: sha256:df84d9ad9ae0f9f574a68999e910afecd7770729b3b0ec48d4064e2ced5efa6c
Deleted: sha256:379a03148681ff8bcb92b7f6ad316a998ad0b5e0d5d97651c3e19d3dad2e90a7
Deleted: sha256:ee2d60bcb3c7380c0d4c6c48c70171af27be7c47ba23081822615eb81008ba98
Deleted: sha256:3cbfab870bca8eb976c646f4223c3b7655e350d88002201ad7c92f850e0054b1
Deleted: sha256:f03f9a570e245f4c4cdba45fd90d04af975508b1e5c0c00bfdf84257c1503e7b
Deleted: sha256:c0f0c6b7b4c823c4f4c068ef28348516bf426631497a428a772c23e38c1a2b7d
Deleted: sha256:e7574635e882f824b5051f10b0c1223dd4702ec281529c7c70f0cb6262614bb4
Deleted: sha256:856aa7c9c6327acd18179582e49ee1effdcb12e6f6600077f3fc82112c60c4d3
Deleted: sha256:0949cc5e28f2d7ece6a4b2d27ee0ce058433c4441eda56aa4e4d600a94e28d96
Deleted: sha256:8b2ab1a71caed205f438ba668adfa0c5ef580b18316aa6d75edf5b503ad7b686
Deleted: sha256:2eac3158bfbd480348969ef811db34feb72f205d26509c0b5b5a50b40586abb7
Deleted: sha256:a1116207e976fe1539a17e580063fa4b8c4e0c3282282900fbf7bbb30805f8ab
Deleted: sha256:9acb7472955e802cad51051c4f8860940df460158427781f1c992d517dfca9b7
Deleted: sha256:a90774bce104f2ef2d270709181e56d5bc44c4487bb43208ee299f44a8f7a88c
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:e1d8bce55776a09ce3de168e03530296b5f19cb6bba63751374c5f3adac71e58
  Associated tags:
 - 20190704-114716
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-114716
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-114716].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:e1d8bce55776a09ce3de168e03530296b5f19cb6bba63751374c5f3adac71e58].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3716/display/redirect?page=changes>

Changes:

[jozsi] Add Jet Runner to the Get Started page

------------------------------------------
[...truncated 207.71 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_04_32_26-16586542757698003753?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_04_38_39-155296506188003562?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.651s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-112420
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:f46841ccbed356eed1b5629b4114bb63a09f19085ef3e28560704056c14a00dd
Deleted: sha256:68589fe63ebb7215e3d4fceb05bda8c16436b4a169453789900da321bd4cc34a
Deleted: sha256:7520f0beb6eb99ec5403a8803f18b1ab87f565a618ef6c89eb555718a6e4f1ef
Deleted: sha256:bdc09b0a757163269c135e8265aedc0ac38ffba45574c65b51d399279bc96c3c
Deleted: sha256:0ca37b89f78d90d36b6945a27ebc8e43bd8f545467274f78a38487074d743fd5
Deleted: sha256:50eab9d256fb7129ee73d7f4e893115783c8b0deeb34502028469075b0cfcbe7
Deleted: sha256:ffc0a8fc3b1155d3f93f6446dbc1e9035ce490bc88c92f6b540531c43ab588a1
Deleted: sha256:38ccc6e7fc51654cea4b6c5146c0bc1bc2089046fa10c699a52d1d5f072574b5
Deleted: sha256:de3dab31d88940d8491a5d4cb8dcb9e0119312453b501164cea70cda045f9f5e
Deleted: sha256:ee524c79b73895db8d87a28aa5cf61a24b3858f4b62ec79b25e3d07802ce7ecf
Deleted: sha256:a0a4a702266ec73e019794e568e5f6992372a5008c07c3ee3eefc56566865e8f
Deleted: sha256:9df58af95e973da158f7e35f516e24986fa108e2550063f45eb2acfbde3380f0
Deleted: sha256:2bfbb153f35165d6d8d5821226b7c57fbe26fd3d06e088bb21eac2b67c3dd987
Deleted: sha256:1335be1fb21693f92342fc461774c573c4666472e316b68f87d3b980011afcae
Deleted: sha256:11f42f73ed8c51c3bfc84137aadf04e8c1ce8e23937a9c063975bae98ed6692e
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:f46841ccbed356eed1b5629b4114bb63a09f19085ef3e28560704056c14a00dd
  Associated tags:
 - 20190704-112420
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-112420
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-112420].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:f46841ccbed356eed1b5629b4114bb63a09f19085ef3e28560704056c14a00dd].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3715/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7682] Fix Combine.GroupedValues javadoc code snippet

------------------------------------------
[...truncated 207.71 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -286: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-264\x12\x04-262'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -286: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-264\x12\x04-262'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_49_55-10794539259424561296?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_56_23-2387914916096189018?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 784.492s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-094238
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:100643552cc011232e0a4bcb43d1b3bdebcea808093fe8bad3c1500cc249563a
Deleted: sha256:e819d6f78c4d30ef8b2398a7d05c05a3d6b5122eb0cdd1c1e7e021d2ecd108ae
Deleted: sha256:a2006b9b46dc9a0797e29fdce831a3549eb78bdd39248fc3e863db5aa9702b4a
Deleted: sha256:b6eb9fe1945c109e4f43d437e2bdf8da7925b0ed015dc79ae3e546b23bffc318
Deleted: sha256:b4c27a773db7e3f965b192c51680f49f6297b38c1e2fabb5dd2d3f752e0b9924
Deleted: sha256:f754dbb4ce5dad739788eaab246bf1875c03cc2e611d179c25bf08ff49d1a6a9
Deleted: sha256:2b8001b39eaecc6df83d43bd646a6944b0d5e8261e5d0d29f2583005d15f359f
Deleted: sha256:c6e3946346cdb520ea2f3327d87b2327a554a7067faf3c81cabe37cc54d1914c
Deleted: sha256:6eeaaa76d780783e1e0e87c77d530a7eb5f8097820b8c9f171051399fad51713
Deleted: sha256:a0a42389b86d4e416b58e69408ec067864330e27b3a3323b72ca420917084483
Deleted: sha256:276f74138ae8d7476b5715daae1e94e79ac6f39448dea7afab4ad94556c844ca
Deleted: sha256:6f3164c2cae5d83a31492fbfeb3a2e6e6e640dc2b6db9984b629fbe60082585e
Deleted: sha256:49ce765de428bd7fcd299ea4f2b3106c2473177b61609837e8b47c0ef733685f
Deleted: sha256:38b6b5eeb3484ac5b3d371c0004edd5754d42e06a6349c8c807075e7dc1242c3
Deleted: sha256:18be53a5c104aae25630403334d6f60722cab1a8b814c95834221e15ce0c8a94
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:100643552cc011232e0a4bcb43d1b3bdebcea808093fe8bad3c1500cc249563a
  Associated tags:
 - 20190704-094238
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-094238
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-094238].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:100643552cc011232e0a4bcb43d1b3bdebcea808093fe8bad3c1500cc249563a].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3714/display/redirect?page=changes>

Changes:

[cyturel] [BEAM-7683] - fix withQueryFn when split is more than 0

------------------------------------------
[...truncated 207.57 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_26_48-15938709806256550563?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
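
Every error in these reports shows the same failure signature: the Dataflow runner harness surfaces a RuntimeException because the Python SDK harness raised KeyError inside bundle_processor.process_bundle while looking up data.ptransform_id. In plain terms, the worker received a data stream keyed by a transform id (the escaped bytes shown in the KeyError) for which the bundle being processed had no registered consumer. The sketch below is an illustration only, with made-up names ('registered_consumers', 'incoming_id'); it is not Beam's actual implementation, just the bare dictionary lookup that produces this kind of error.

    # Illustrative sketch only -- 'registered_consumers' and 'incoming_id' are hypothetical names.
    registered_consumers = {u'\n\x04-300\x12\x04-298': None}  # transform ids this bundle knows about
    incoming_id = u'\n\x04-344\x12\x04-342'                    # id carried by the arriving data stream
    try:
        consumer = registered_consumers[incoming_id]           # fails the same way as bundle_processor.py line 593
    except KeyError as err:
        print('no consumer registered for %r' % (err.args[0],))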

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_33_00-10256505019860755171?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 793.960s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a
Deleted: sha256:ed0446abf010b5dfb4035282a41d1b0e3938d881f44e5ea52e4ff38479de08f5
Deleted: sha256:358837b7c8131a4e0fc256da3c1172d5df8c96582f921580c24e535931b2aa4b
Deleted: sha256:a427a3ba45224aa880c139ca05ffb8178e22957d8c5f2854b1603d3ef4fec0a2
Deleted: sha256:013a877a236bb64c30270a9386a15947571d4b06be265b5f3b063669542048ae
Deleted: sha256:16f19032bf88cc67b4c7161a80949224e896b37f11dbfe6ffa801c7b30d73c87
Deleted: sha256:595a6ea07b8712f4a61627bdb090b00b52cfb1e3f10e62eb04e4c0752e612835
Deleted: sha256:d1444692228516f4454659bc0e8c4694ac27600f43adc4347e3026754c1ab22e
Deleted: sha256:47e711fab358383c37c26850df37cf3222ed38e6c93c012a9c24d5bbfaa5f926
Deleted: sha256:b4b2c9baedb6b5c96af2254ab1bda17b469e7d84daca151ac4ec9717f06b9614
Deleted: sha256:c220f87ecc0eb530bbb3fb08ffd16c8a94c9d3dd9e5f6a7749c3fcc7892617d4
Deleted: sha256:640a6d5419fc32fdce2381a7dace620cf76cfe8c384251b67a81a3e0a9b4e536
Deleted: sha256:2ed8bf7be26e6ab7a199835b1f925c48280544a628ffed8386b874d60f858df8
Deleted: sha256:f7e60c6b424ef14a8adcc19e84b39f8a19392db0130f06dec0d426ee6da124f1
Deleted: sha256:a0c262d3183cfdc2b442ad1e8e05af479eaf08455639deb563481626ce7c3ce2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a
  Associated tags:
 - 20190704-091914
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3713/display/redirect>

------------------------------------------
[...truncated 208.23 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_23_08_34-4850423435734311088?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_23_15_37-11122371123735617418?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 773.921s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-060010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:32738efe692bb8182f0bb343478d002bae1a4cd3fca95d1526cf6adb9c63741b
Deleted: sha256:c1a60af86de73580da028cf1982225731ff52a2150c2da81b51412ed6b0366ba
Deleted: sha256:ab2d5cfa83cc09af94fd519f35ebae8393696735f9f4c99ff6fd4435a619af29
Deleted: sha256:1b023a78a8c5aca65f984e6392a0a3c4811d093268ceb8f68bd9ec51e164d3dd
Deleted: sha256:77c629b108950f0d5ef53312cba2c71551df6531daf6afdd49204029ad689df2
Deleted: sha256:cefa34b6484b8ef4e23c6ba3fb8d0269a93d5b043a7deada6c809180dafc0bec
Deleted: sha256:c2a6f35ee1f99d30b8dff0da9d591f5f01eaee9250f57080dc5e5e8720150165
Deleted: sha256:a81f797c6010d0baa3935c2a12d947c973fbee24bd704dfab74ae954b6e6ee1c
Deleted: sha256:37d3ec3addac4e17dcfc21e4f1d68b68c718d5b4628aabd5a0d7e1ec306c079b
Deleted: sha256:7e350ba7c88345d4d26b054900da36be2e304cb90b03dc598fd77600c0dc5587
Deleted: sha256:f95dfa7562967de503347e8e719a1a01761bd09145b6649435f8951c4a476199
Deleted: sha256:4583d98b7ea297b9e410f05bb17cffa4e705f58a8c1669d602dbdd9b5d83880d
Deleted: sha256:bf1465d604f247d3b15c202cf2311c7481d69f52fbaba11039daeb9f68fb8784
Deleted: sha256:dc006918c84110e2f3c2b20406a2690ebcd19a3986ed5c6a8887f9c5ecab1ea6
Deleted: sha256:8fe2b235892f0fe27717bb32ad9667169349ba69dc1b5810d0db1b73735f879b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:32738efe692bb8182f0bb343478d002bae1a4cd3fca95d1526cf6adb9c63741b
  Associated tags:
 - 20190704-060010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-060010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-060010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:32738efe692bb8182f0bb343478d002bae1a4cd3fca95d1526cf6adb9c63741b].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3712/display/redirect>

------------------------------------------
[...truncated 208.03 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_17_10_41-2658235435766827557?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_17_17_35-8395085879827506245?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 834.280s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-000014
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3f40c9a4eda6cdd8c106cecc98ea0a5a0434678aed4352b38db3616ff11a6c5
Deleted: sha256:17aea3da87f682789c7401ac58ffad6e73692187a3e476049a396c3de4d89198
Deleted: sha256:5e700b728429d010caf3f963d5ba144113f9c3b34e485756300ebebe5ef05eff
Deleted: sha256:51698b273f661a7ed82c68d10c87ff6d71400253b8ea707b8a7e6600f5b5d2ee
Deleted: sha256:e07bd6e1a0df5362ba7ba571ebcf4bafe6c4fba622072bff0c14945d4c79e4dd
Deleted: sha256:607dc50022d50bde9df1ae83164ad1992d2592ea8b627485ebebc9200aa3db0b
Deleted: sha256:3812035e509dc186caa930d6ae59e3d95bb9b7c46a00d9a13390b960e2e76886
Deleted: sha256:72b79a7f40c57865afc936f191381c2ee33d30c417d89b6d583c1ead9ab84cb3
Deleted: sha256:381852c75acbd69de8a68ba301df1d2224be1216b3cbd5ddddb2e563f1d42f83
Deleted: sha256:a2bb2d28f85bbd42b218c0b978b3856422a1f510c74902aeb7cd4a023f20a560
Deleted: sha256:a7a85db47dcbfeb404f9705647c3f7a3a55b3be143fa3b93438ce6944b57fe8f
Deleted: sha256:a418379935c2b8058d4243d836f522d1d048db4d48ea56ac82e026df0dfea3ce
Deleted: sha256:5aead3e762a136e253516835e6841e645ef68202a6804a4fb4366766809706be
Deleted: sha256:75c96019fc05c7c83a35c7cbdb0f59713c9f4385de1d135b6955556c897056d9
Deleted: sha256:e1ab62dd1e1de6e2a44e1ba814af73d7f86195fd77d06bf065fd67bfa766d40c
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3f40c9a4eda6cdd8c106cecc98ea0a5a0434678aed4352b38db3616ff11a6c5
  Associated tags:
 - 20190704-000014
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-000014
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-000014].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3f40c9a4eda6cdd8c106cecc98ea0a5a0434678aed4352b38db3616ff11a6c5].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3711/display/redirect>

------------------------------------------
[...truncated 208.08 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_11_11_08-9472331082412783811?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_11_18_06-17100805536906833310?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 825.422s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-180246
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:090e589831c4d533b1e39f498b4ca0c8ee6be0a1d4a8149e29a9a83cd8947eee
Deleted: sha256:d548816b5b3225a8f2c16ff789dfe652d8778b42802adfd1d53b4fa8b1942d37
Deleted: sha256:9dcbff1d0471019ae082dcd3ac065361ebf2770510e85b3cba942960bb2f32b8
Deleted: sha256:75bfbf90640edf605ef6be451b1d5bd02e95569ccb703d314a8d000ec25c7eb9
Deleted: sha256:2d84a0e29a29f49bb508044206fe5a2e8d38f85f4a765548338867a5dd3735f2
Deleted: sha256:057022847507ce2c19c40262150fc1a7d11bed42f0b018f52032cb23e5c67a6b
Deleted: sha256:bd7bff5878ce873e4f29bdcd10f1fb3d44b5b50a40bb86ddfbc4b5de3fe54a3b
Deleted: sha256:955223f81ad47656ed51678a98229a70d2bbec2769eaeea6cecab6a1ef9c6922
Deleted: sha256:2e64e6c74805987628e58f759198eded140616ad7d048ddf96e957a1bc2f485e
Deleted: sha256:1182e850ad95da278faa83fc8bca59a5979b2292fa12d6f761146b39323eb29d
Deleted: sha256:76099c1f8be9391fd93e5b42cae1514db06e8e8e911e146f1fa17bd43b409bab
Deleted: sha256:2353f820942b29f2bcc9b0861795a2de1a2d3511d115fecc2667f3a3b4b33fb2
Deleted: sha256:a58a361aa024d9c8c6be42ffa76deb1f617b5c18e796ad7b45f2fb7f5631c5ed
Deleted: sha256:0f37ade3cfcb6aef8e487b665faa284b069495cb9b3535ec3f016c556e0b2a3a
Deleted: sha256:20552ef46dbc59f7808cfd47425facd449ad137c506e4ba42d50762f8ff1a312
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:090e589831c4d533b1e39f498b4ca0c8ee6be0a1d4a8149e29a9a83cd8947eee
  Associated tags:
 - 20190703-180246
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-180246
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-180246].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:090e589831c4d533b1e39f498b4ca0c8ee6be0a1d4a8149e29a9a83cd8947eee].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


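Note on the recurring failure: in every build above, the SDK harness raises KeyError when it looks up data.ptransform_id in bundle_processor.process_bundle. The key it receives (for example u'\n\x04-186\x12\x04-184') is not a plain transform id; it looks like a serialized two-field protobuf message whose length-delimited fields hold "-186" and "-184", which suggests a Fn API data-plane mismatch between the Dataflow runner harness and the SDK harness container under test rather than a flaky test (only the numeric ids change between builds). The following is a minimal sketch, not Beam code, that decodes such a key assuming standard protobuf wire format with single-byte lengths; parse_length_delimited_fields is a hypothetical helper introduced only for illustration.

# Hypothetical sketch (not Beam code): decode the KeyError key above,
# assuming a flat protobuf message of length-delimited fields whose
# lengths fit in one byte, which holds for this particular value.
def parse_length_delimited_fields(raw):
    raw = bytearray(raw)
    fields = {}
    i = 0
    while i < len(raw):
        field_number, wire_type = raw[i] >> 3, raw[i] & 0x7
        assert wire_type == 2, 'sketch only handles length-delimited fields'
        length = raw[i + 1]
        fields[field_number] = bytes(raw[i + 2:i + 2 + length]).decode('ascii')
        i += 2 + length
    return fields

print(parse_length_delimited_fields(b'\n\x04-186\x12\x04-184'))
# -> {1: '-186', 2: '-184'}
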
Build failed in Jenkins: beam_PostCommit_Py_ValCont #3710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3710/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4420] Allow connecting to zookeeper using external ip

[lukasz.gajowy] [BEAM-4420] Add KafkaIO integration test pipeline

------------------------------------------
[...truncated 207.53 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_06_35_18-5030715921131329098?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_06_41_51-5852654497484027486?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 763.913s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-132714
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9a9c02482490f91222c7038b2924b63cec94f72b9523b49ec9a2ce70ecaa819
Deleted: sha256:ef5a88078787910d0221e76f5149174906cd4db0c0c52b62b0848c77b87b2f2c
Deleted: sha256:53fd83ad3aae9f1a88601cb90a53b839e4b9102d3e14bfe8c9a33e756e6d22b3
Deleted: sha256:6baaabc23049a9777cc234eb79cd4a738c5410eebee40c6dfb61392fe1abd32c
Deleted: sha256:d08a864d38673f16c2ac5e30b994671ceccc2e267e6bcc2f7361e17538ce1a69
Deleted: sha256:1e4f87c403326216a83b6c7bb78b419ad5e12300d1ca07e92ddae37ed65a3ba2
Deleted: sha256:d3f03f6ca22517dd01c49a218c0c28bf4499bb20c5ff341ef97dd196e932d428
Deleted: sha256:59f980679ba2fdc31db9f4e607de43a91c43f877ca2f56bc34a02b6ec1927abb
Deleted: sha256:a647a673869ca67628274ff1d87d7977ef7cd54479c759608aa6d9d80dc52d66
Deleted: sha256:9a7603cfa80f0219c9bc180cfe3815436127d094ad9ec6bca66b96fe32519373
Deleted: sha256:332c8a72407fd7c68304c9711d34d4a67c20433d54121efebaa86f2ccb59d24d
Deleted: sha256:3ffacd5a4745b97d114c8e0b4466ead9cebca5a0d24bdf94708f382047e70f2f
Deleted: sha256:ba92ba88cb7491ff302299cc0ad8a7b4cd58a1c6d50428c2073c0713cd09da96
Deleted: sha256:5fa79ce3baf88b8e768451ef1f6ad9fd4ffae162cc787cb6a8a8e5e2939a3c78
Deleted: sha256:f6e770c24bff1fb797172d6a2b3e0e22853a7ae7826a505f72aba882c8412c99
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9a9c02482490f91222c7038b2924b63cec94f72b9523b49ec9a2ce70ecaa819
  Associated tags:
 - 20190703-132714
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-132714
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-132714].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9a9c02482490f91222c7038b2924b63cec94f72b9523b49ec9a2ce70ecaa819].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


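For reference, the call chain shown in the tracebacks (test_pipeline.get_full_options_as_args(**extra_opts) -> wordcount.run(...) -> p.run() -> wait_until_finish) corresponds roughly to the sketch below. TestPipeline and get_full_options_as_args are the Beam testing APIs named in the traceback; the option value here is only a stand-in taken from the "Discarding unparseable args" warning, not the jobs' actual configuration.

# Rough sketch of the call chain in the tracebacks above; the extra_opts
# value is a stand-in, not the options used by these Jenkins jobs.
from apache_beam.examples import wordcount
from apache_beam.testing.test_pipeline import TestPipeline

# TestPipeline reads the --test-pipeline-options string passed to nosetests.
test_pipeline = TestPipeline(is_integration_test=True)

# Per-test options are merged in before handing argv to the example's parser.
extra_opts = {'output': 'gs://temp-storage-for-end-to-end-tests/output'}
wordcount.run(test_pipeline.get_full_options_as_args(**extra_opts))
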
Build failed in Jenkins: beam_PostCommit_Py_ValCont #3709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3709/display/redirect>

------------------------------------------
[...truncated 207.78 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_05_09_06-8987770049284610358?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_05_15_40-10014152392539054298?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 800.483s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-120009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:59afbef89aacdbf259d156ab55e422cefe0eb77a4c0c657fba2097e479017307
Deleted: sha256:385b24a5a9c5c44721b8766a5d3e863c1225aca3f046c55fd0abf1a3866ced51
Deleted: sha256:ebdc060b30906707e4e0c04f77f548d05ea6df2ce95d8ffe050b6696694f6dff
Deleted: sha256:af74a8506739de637725110ac10bceaa9a968bce463bcfa577f9a96f1e57e50a
Deleted: sha256:b4173f47fd9d060c7d5b625441ab98fde35f85764da71ae3d597cc06bb4150aa
Deleted: sha256:0d8662f702b5a72e9cd0be7f828ea8c6a64104288270d479da05ee5c397662fe
Deleted: sha256:e58e667d12f477915cb76c28fdf925437f9b9ee29d272d64c0fd9304a5f83400
Deleted: sha256:6d3678e5d7c1fea3ad3f983110a3c14af92747a77834dc2ebf9596cdc5df13f9
Deleted: sha256:80525fd5c25c4271497124bf30cccd3699363e338920ecf1af022d84291505bb
Deleted: sha256:509f178dc75757be8e6e6ae45bf6f8e9a95b5c2c9c533ecf2ef72c455c533fac
Deleted: sha256:3c049da047721bff89e16808e80d9fb66922c1580285c2042b34fb3ada7619f1
Deleted: sha256:eb32709ca125bde8ff6c2c84d720c7bfb79d14e0646254b35bbbbd0141ebdc5d
Deleted: sha256:dbf4e35e776e828c2eb5c19b7a2b6d75371fc663b2a08c3d3e8ae253d4710ff1
Deleted: sha256:71dd0ed5a5f5f590d58515df09c1998cfbec54d02d060a0c3e3df63bb9d8d9b0
Deleted: sha256:4f56c681d25714d724b65da34520a7172004d6ac7b4ea13603eb6f5f54de0e37
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:59afbef89aacdbf259d156ab55e422cefe0eb77a4c0c657fba2097e479017307
  Associated tags:
 - 20190703-120009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-120009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-120009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:59afbef89aacdbf259d156ab55e422cefe0eb77a4c0c657fba2097e479017307].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3708/display/redirect>

------------------------------------------
[...truncated 208.37 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
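
A minimal sketch of the migration the warning asks for, assuming the v1new module layout named above; the kind and project are hypothetical:

    # Old, deprecated client: apache_beam.io.gcp.datastore.v1.datastoreio
    # Replacement named in the warning:
    from apache_beam.io.gcp.datastore.v1new import datastoreio
    from apache_beam.io.gcp.datastore.v1new.types import Query

    # Hypothetical query; in a real pipeline this transform is applied via
    # `pipeline | datastoreio.ReadFromDatastore(query)`.
    query = Query(kind='MyKind', project='my-project')
    read = datastoreio.ReadFromDatastore(query)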
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
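
The "Discarding unparseable args" warnings above appear to come from the pipeline options parser dropping flags it has no registered definition for (here the example-only --output flag); a minimal sketch with placeholder values:

    from apache_beam.options.pipeline_options import PipelineOptions

    # --output is parsed by the wordcount example itself, not by PipelineOptions,
    # so materializing the known options warns about it and drops it.
    options = PipelineOptions(
        ['--runner=DirectRunner',
         '--output=gs://temp-storage-for-end-to-end-tests/output'])
    print(options.get_all_options(drop_default=True))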

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_23_08_26-9004033485177574791?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
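
The KeyError key in the traceback above is not a readable transform name; it looks like protobuf wire-format bytes identifying the data target the harness could not find. A minimal sketch of how those bytes decode (the two-string-field interpretation is an assumption):

    # u'\n\x04-186\x12\x04-184' read with the standard protobuf wire rules:
    raw = b'\n\x04-186\x12\x04-184'
    # 0x0a -> field 1, length-delimited; 0x04 -> 4 bytes -> b'-186'
    # 0x12 -> field 2, length-delimited; 0x04 -> 4 bytes -> b'-184'
    field1, field2 = raw[2:6].decode(), raw[8:12].decode()
    print(field1, field2)  # -186 -184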

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_23_15_04-3701900803582736607?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 789.947s

FAILED (errors=2)
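
Both tests fail inside wait_until_finish, which raises DataflowRuntimeException once the service reports a FAILED terminal state. A minimal, runner-agnostic sketch of that wait/raise pattern (DirectRunner here, so it completes instead of reproducing the failure; on Dataflow the test runner also passes a duration in milliseconds):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    p = beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner']))
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)
    result = p.run()
    try:
        state = result.wait_until_finish()
        print(state)  # e.g. DONE
    except DataflowRuntimeException as exc:
        # On Dataflow this carries the last_error_msg quoted in the reports above.
        print(exc)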
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-060013
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:576500b2ff11e126f3a1ce1dafdbbcf49e8d31e1d5584113b58e79e0577cae38
Deleted: sha256:b7892cdd488fa755d1aedc268de78c15c973f4dfe7b6c4b536c01b751f5c2949
Deleted: sha256:280a909d14b97b4ac6624a0b8005f7e046d16b9c366b961ff4b4018efaf856fb
Deleted: sha256:28fb905f46adb0e2e126916f714b898a75646bb3bbdfedfbdadced9faca3992c
Deleted: sha256:2c7e67784a4598c40ef81339d7d50ec4621076892c67a54e403d31319e1015f3
Deleted: sha256:0d955c3685a683c7836c738fa1a6c463c51b761448fc1c4c0a474d4fe41bc567
Deleted: sha256:a58548e12a1006c91662223e4a0896134dce4bb7a7616125134fd2041b1ec331
Deleted: sha256:ec3b2a218e97f874720c337d0b2bd51362f801b32c6246aadb74775e324c6030
Deleted: sha256:99830024467faac658087c55e63e3c47536fd52e9787c0b923f56a4cfb18baa5
Deleted: sha256:aedce23eeb2e6f11b0c75c5e01e47d4c500c6377e0e37e6ccd5ff6cdf10aa01c
Deleted: sha256:62ec5d94997db4f56eae38a753ca3ca00f44edbdf9e39d53a363f9357379027e
Deleted: sha256:2ba43b0c845249575fe9fe503396ab091ec3d8141f3ef158f2cbb2716d94a245
Deleted: sha256:33cbb63c004af1fb4480ef74ddc3829865fa047f44faaf39200ea1cb40763b2b
Deleted: sha256:562291ffaeb2f8b0b33ea844e3060035abe2480868375ef97ac3e7cb17b4b5ca
Deleted: sha256:5c7f923712bc96db154329989ee4e41732076c76341506aa2e708bb790f24927
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:576500b2ff11e126f3a1ce1dafdbbcf49e8d31e1d5584113b58e79e0577cae38
  Associated tags:
 - 20190703-060013
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-060013
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-060013].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:576500b2ff11e126f3a1ce1dafdbbcf49e8d31e1d5584113b58e79e0577cae38].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3707/display/redirect>

------------------------------------------
[...truncated 207.39 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_17_07_24-7151685985290410929?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_17_13_47-11217322693237963728?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 789.840s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-000008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:598863df86fbbf56cc5b4309f76e449fe3370fc19a0d16b620dc713b692d265c
Deleted: sha256:d1c38b6df6ca66f2d65eb4c85a780502bab6c21faf3e326b924cac8e92b46a95
Deleted: sha256:5d54fbec7b156a26a60c0bee3c25ec3b16072fb835b9af5f9fd40d16bb6473b5
Deleted: sha256:68b69469d168e8b16d93ac0a61085babafdeb1674e89d5d0a5d78f254c711aad
Deleted: sha256:19744594fb06481bbe398f8fca174cbbccb365b58d059c08d776df02cf28b0cc
Deleted: sha256:1f6b42aab0c74d9154eb8b6ab2644c9296d8448592ed4a8053b304155f0f5c2e
Deleted: sha256:012ce0bd3de63786f23ad6b688a1bf2528f9f4bbfd00b7b6e34459dde8edf23d
Deleted: sha256:a65eebfe768014214e83e94d17fc501f0e08b8a26db69d180341031f242d67ee
Deleted: sha256:16363a8788f84db8c0c68fe2839614b69d02c3651912b5c4af148507e1cf6d21
Deleted: sha256:4b8eebe3ba09db0e08fc702046a1d6a513be1f30a50c80c5ea3459e2108685fd
Deleted: sha256:8929d8d3d4aa0529c8897a1c25adb7cc2f16aefcf7ae2f4a33691ca2e9d90a73
Deleted: sha256:f1a0afca1688f46f2ddd047f4a489ae679eebbad63d76cca8a38473bb67378c5
Deleted: sha256:acafd95cf2369c811ae5d734b0a7ffa5ebdb9fe48a1c7eca91e62fe6ba22f470
Deleted: sha256:5d5bcf5ecaa26d08c3e3d902708cdc24b59cca347c3859128c8c175acc31b6a5
Deleted: sha256:5f27429ffe1382e5c2a7d34047ddf35b1fb149941f74db050f614873045f66bf
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:598863df86fbbf56cc5b4309f76e449fe3370fc19a0d16b620dc713b692d265c
  Associated tags:
 - 20190703-000008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-000008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-000008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:598863df86fbbf56cc5b4309f76e449fe3370fc19a0d16b620dc713b692d265c].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3706/display/redirect?page=changes>

Changes:

[juta.staes] [BEAM-5315] improve test coverage bigquery special chars

------------------------------------------
[...truncated 207.81 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_15_10_58-14950782493778738380?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_15_17_56-10994450702539444910?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 854.534s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-220323
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2a1cd5709c3c76f302bc269e9c5504ea1c7aa0788fc99c4cbd72adf7d4ab895c
Deleted: sha256:1ee0707e5098cadac47110ffa684daf90b86f8d9fa5042eebf7b48f901af6149
Deleted: sha256:8edbff6e4a0b05060df68bb23c97fc94610989cae1794631b16477637e904502
Deleted: sha256:666a02c25e00dea430ec0ec815c06bd539c70b74169802f930a34cdb07ca2b69
Deleted: sha256:6b3b232f6fd29cd00e210eba71cc08f75d6f4b93f7ff18ce0af1b8aca79e4b4e
Deleted: sha256:185c83bfaed634ce2174e57cc23052b61e91e3d4d24c5729c0459a0d8dd977c7
Deleted: sha256:e1ffc7786b6e454e55f644b03f7259ebae768875b89cbd0a15c1723eee093ed3
Deleted: sha256:b473c390c064ecffa68572b0128d40d35ea0d72f20955cea6472e9cec4d9b7de
Deleted: sha256:87c89f0f08bd05ade65b58693ec01f21552b6b74ccba47718c55369a024a9128
Deleted: sha256:ef7673aba6b30688e745800a2033eb428a9701f2a285b954ca68a4c957c858d4
Deleted: sha256:0c48d629c866270ac4059a075aaaf2ebcc51f7e44ca6d3495ff40c3b3dc1969c
Deleted: sha256:11fe30fd180e2b216bfada3f647011f767921f53b4b3fdbb421e28492e14f357
Deleted: sha256:adc996a1dec14e7aece2a6f4f6b902239831e489f665ccbfb82bca70a803b759
Deleted: sha256:0214bb82a476649967588f1854dbf9cf64f76ba203e64b7516f6915027d44852
Deleted: sha256:abe3660faaa48ba8b912fd0984f0b10627046c9642708a2db52a11f4adc78ac7
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2a1cd5709c3c76f302bc269e9c5504ea1c7aa0788fc99c4cbd72adf7d4ab895c
  Associated tags:
 - 20190702-220323
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-220323
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-220323].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2a1cd5709c3c76f302bc269e9c5504ea1c7aa0788fc99c4cbd72adf7d4ab895c].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3705/display/redirect>

------------------------------------------
[...truncated 207.30 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_11_10_10-11107943754467233090?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
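
The KeyError in the SDK-harness traceback above comes from a plain dict lookup
in bundle_processor.process_bundle: incoming data is dispatched by its
ptransform_id to the operations registered for that bundle, and an id the
harness never registered fails the lookup. A minimal, self-contained
illustration of that failure mode (illustrative names, not the Beam source):

    # The harness keeps operations keyed by the transform ids it registered.
    class DataElement(object):
        def __init__(self, ptransform_id, data):
            self.ptransform_id = ptransform_id
            self.data = data

    class Op(object):
        def process_encoded(self, data):
            print('processing %r' % (data,))

    ops_by_transform_id = {u'known-id': Op()}  # ids registered for this bundle

    # An element arrives tagged with an id that was never registered
    # (the raw bytes below are the key reported in the log above).
    element = DataElement(u'\n\x04-107\x12\x04-105', b'payload')
    ops_by_transform_id[element.ptransform_id].process_encoded(element.data)
    # -> KeyError: u'\n\x04-107\x12\x04-105'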

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_11_16_44-9779842156111470113?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 826.072s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-180208
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4444800c7ac1e88dac85f258838c3da5e5e332ecf71fdcdffdf19dbe8e548280
Deleted: sha256:afe7884705f5046b1a0d922ea8d003df9741f50bf57fdf15d8a68fbf21a42da7
Deleted: sha256:e76b103d4fb8ee0fba438213535227f71ebaab1495ca353cffc8ee906a79bd2b
Deleted: sha256:905208c71b89182b9d990b65b2ff05eddbc2052552f38a1c0a5daf6b08cd1a1c
Deleted: sha256:646b182142bf9dad1b75e3b7a6eeb727ac2561ef90be862c2977f85b1747976f
Deleted: sha256:bb5d18fba12981dc0fa14702f7451d1c5ade3edc9be038c11bbb4ed5a14c8e66
Deleted: sha256:4b4076aedbd0638e2bd18f982cbbc17e7bfaa75ded2a7a54d75155df19ba5fdc
Deleted: sha256:a4733fb4fa1e29a0c1dc642ec757120828272f0c88bf42cd0234c82be4802698
Deleted: sha256:5486ad8178602e51612d0b977b6ff006253f96a8d477f8d5ead510a4e7815676
Deleted: sha256:e8f2a93b8975efafd4f45cbde2570f7125f3e1cbf0da1e88b29c2b4ab9e34f50
Deleted: sha256:075ff23a27bf4d71e31e5a8a91496ed77a57ebe6658730294c929c8c830c8816
Deleted: sha256:2e4a661ebfa130929e6edb1facb92bbae454af7e49e4749653381db4e77a7c29
Deleted: sha256:f75320daefe512c6fda24b91273d3498d2184b55b1327d169792169990dac0d2
Deleted: sha256:c4d1c5ae511a38ef9075fa93068fbb5e407b35fbf9d9610182459996613a5780
Deleted: sha256:bb1ed1014d3c78799bf0b88b5c62ee6126fb3e7323974f38cbaf8a6ded3ab255
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4444800c7ac1e88dac85f258838c3da5e5e332ecf71fdcdffdf19dbe8e548280
  Associated tags:
 - 20190702-180208
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-180208
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-180208].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4444800c7ac1e88dac85f258838c3da5e5e332ecf71fdcdffdf19dbe8e548280].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3704/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7504] Added top_count parameter

[kamil.wasilewski] [BEAM-7504] Create Combine Python Load Test Jenkins job

------------------------------------------
[...truncated 208.25 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_10_05_13-12516357940795460149?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_10_11_21-669043431429153585?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 768.829s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-165755
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:df2f3bca1f5a24e6349101d77f432239a20c2d376c6926c1e863aa0319b0e7c0
Deleted: sha256:7daccc87bf472a28cdaac47351b4b1a864f6b62dba3719a847c3324e2a1c0c21
Deleted: sha256:4ba98c553dec1b28695c5510fefbd059bb3cea7818d5f93413aad515c06f7074
Deleted: sha256:47f1782237152d8bf55c0fc19863fb207b15fe7d335c24f1d1368d13a326e0f6
Deleted: sha256:b57c1d9aa30eb7025e4cc8ce7d6c50de39ca6bd3c472eadcff342db313a5fa09
Deleted: sha256:7c6af6a321eb9e86a5169eadc0815d8ce0d9035064e5904b18066613fe6d3233
Deleted: sha256:41b3cb9341ef7957122b71ce1069052b09c53d35cb1cb601ebbf47f865979580
Deleted: sha256:0c6faa6b4291fae395f15224a60341a07b699da03afcaa1252516bd0b9b8fd4b
Deleted: sha256:51a33eae07de99cabbda61058606e4cfe1e8a5edea594aec0e903ecb3f3ae039
Deleted: sha256:2da773acb68be59732b90159dc8530f2913d4d9d39ccf96d1d3a3e46bc9547fa
Deleted: sha256:0374f77fe7682f42d7e4c152a9c38dda7d71425a9e47c9544e0716a8d7d4d4d8
Deleted: sha256:6f749914ec17347c9e0cac50b57843469e68cd357ef4f5953f734611a27fc70a
Deleted: sha256:23d6c3d53155de234b12d92a0ad423779203d2503d12e4de7f69dbe1db1b3165
Deleted: sha256:47cb1b27537a1466de7b210f9e55a897bbfd6a39a67bc20d4a714a1876a13f9f
Deleted: sha256:a9427185ea2260922caecfd96227407aee42a0f8b8258b03c5e9d205333b6ad2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:df2f3bca1f5a24e6349101d77f432239a20c2d376c6926c1e863aa0319b0e7c0
  Associated tags:
 - 20190702-165755
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-165755
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-165755].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:df2f3bca1f5a24e6349101d77f432239a20c2d376c6926c1e863aa0319b0e7c0].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3703

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3703/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7640] Rename the package name for amazon-web-services2 from aws to

------------------------------------------
[...truncated 207.52 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_09_14_57-17861540338672109338?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_09_21_16-415108671860983704?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 735.160s

FAILED (errors=2)
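A note for readers scanning the repeated stack traces above: the step that fails is a plain dictionary lookup keyed by ptransform_id inside the SDK harness. A minimal, self-contained sketch of that failure shape follows; it is an illustration, not Beam source code, the dictionary name is invented, and only the key value is copied from the log above.

# Illustration only, not Beam source. The harness keeps a mapping from
# ptransform id to the operation that consumes its data; a data packet
# addressed to an id that was never registered fails the lookup with the
# KeyError seen in the traceback.
registered_ops = {}  # hypothetical: the incoming id was never registered here

incoming_ptransform_id = u'\n\x04-108\x12\x04-106'  # key value copied from the log above

try:
    registered_ops[incoming_ptransform_id].process_encoded(b'payload')
except KeyError as err:
    print('No operation registered for ptransform id %r' % (err.args[0],))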
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-160508
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ae13728f56a0cbf003e4f0a510df3292afddeb51f32493e8dfb7ec85aab7532b
Deleted: sha256:cbd40ef02200ad2473761de64060130e50a163a6d1845beaaa68dde94247ea97
Deleted: sha256:21412ef4c5cafcaf639ca2e00400e38bf509b15b782237edd039e845ea30fa03
Deleted: sha256:4979145dacc35ad222d5eb40243623e271bf6fff8a2a258e3a71719145683af9
Deleted: sha256:11263dac81cbb269d1e27964b6595822dcf15e033beb227065f23c550b335ae4
Deleted: sha256:29b84ed3e898014c9c3719fbd3c18e28f86ef0a89c24c1b4ab24f378146cdf22
Deleted: sha256:944dfdf2c9cbd240882bdbd4b2f330da1f4be441748bf4cea86f8c7050c29a3e
Deleted: sha256:5b7ca6235f3154ddccfdb71ec135c93f16d021ac655372c0d01822f9c67d47f2
Deleted: sha256:7f5aa52c079b4607e66849c3b65e2645c71c0529f02a41fd68a559ebcacccb93
Deleted: sha256:d22a5cbbcd034acfb22f4fd7c85f26a453b48782bb673af0fa3fbad91ae250c2
Deleted: sha256:45526fbf6464c46a22cc7e3a75b535ea2af9549f7b29237251a216573548322c
Deleted: sha256:9e5fa2b9fbbed0ea03aec5ea3d95f4a1f86ca87a96c8848b82ca5590722111cb
Deleted: sha256:6efbdcb01631ff262d1938b2af60037c89bbe044a000d75f55dac817f088a1e9
Deleted: sha256:76a73ffd2cd74e12cd1176f70009831fa43960e41293d63a751f1bed47fa0f66
Deleted: sha256:65573fc78a589e0642d73f8ffcdf3745fe5b851f4e3acf55cf49e9e2e108174a
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ae13728f56a0cbf003e4f0a510df3292afddeb51f32493e8dfb7ec85aab7532b
  Associated tags:
 - 20190702-160508
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-160508
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-160508].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ae13728f56a0cbf003e4f0a510df3292afddeb51f32493e8dfb7ec85aab7532b].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3702/display/redirect?page=changes>

Changes:

[daniel.o.programmer] Update python containers to beam-master-20190605

------------------------------------------
[...truncated 208.36 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
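On the Datastore deprecation warning above: the replacement module path is printed verbatim in the log; a one-line sketch of the migration import follows, where ReadFromDatastore is assumed to be the transform exposed by the new module and should be checked against the installed SDK.

# Migration target named by the warning above; the transform name is assumed.
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore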
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -414: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-356\x12\x04-354'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -414: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-356\x12\x04-354'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_08_40_05-2219288925707518971?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_08_47_09-16003912165937940398?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 804.452s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-153207
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:7ae85fd272783669493e925f171ebd6c9f750a287019439abbd136e8f86e82ec
Deleted: sha256:304948ed198acc740405dcaa0aa243bffbdbc7853248c05aeaf64a56bdf82a6c
Deleted: sha256:d314bfaf51df469713261b4faa8330ed3152d219cedce6a1d287bd86ab9f3143
Deleted: sha256:b426fba5d910e39b99e74f637b158124285c69cd2703464a4816d4715b0d42cf
Deleted: sha256:416af233bcebb2f7551071db0e8148c7b491e898dc3546abefa7ca3c46fccf3e
Deleted: sha256:20949fe8fdab2454332086a04ecd6cf97096bb4132a9c5df614a9207d9b64aff
Deleted: sha256:0609457b7db74e34f618c0e434b1fe705cc751261d703101d49b6b11826a1eb0
Deleted: sha256:11efe9e8c662e558f065f0ca02262e4d1aa4778b71d0a7ec007e7fac0f59f3ef
Deleted: sha256:13957d78e9c4e88545c3e508f686be211793bc79bf23986a517403cbe491d66c
Deleted: sha256:22985ee9deac2ae95ce18e84c570f539608e45119930e521eb12bb4b969cb336
Deleted: sha256:16db7943fd901c6b743d917c821f9d44358e32a585d87a680f5892f2c884703b
Deleted: sha256:df9deaec0a17ac7dc202d6bcad67750504c742e4bf4c13235c87d22a78c7a106
Deleted: sha256:d770238c715ca36d4fd711e4edb048a9a9d53ea69078ae16e33b87e3fe1577ab
Deleted: sha256:513e886f1cb31016609ba9743b13f064fc11f044d18c1dd64629750532ab10b1
Deleted: sha256:a56ae218aee6e7c33f4855066b4bd54a1648b4bc212066f13dacd01efcaf5f89
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:7ae85fd272783669493e925f171ebd6c9f750a287019439abbd136e8f86e82ec
  Associated tags:
 - 20190702-153207
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-153207
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-153207].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:7ae85fd272783669493e925f171ebd6c9f750a287019439abbd136e8f86e82ec].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3701/display/redirect>

------------------------------------------
[...truncated 208.12 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -379: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-321\x12\x04-319'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -379: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-321\x12\x04-319'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_05_18_58-8606748156447795834?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_05_25_17-1767413820336237218?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.096s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-121148
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:e0105589b3d0a126d38d45887e2a8a4a19495c4caa482c806f57deaa3d9ab248
Deleted: sha256:093239b013423284e634bbb8db542df7b800adaeaf009597a62a844ff216e7c0
Deleted: sha256:1f6474e593446db3e0cdb78736e6e956e426b2b28b49ed1a825564c5cd4699cf
Deleted: sha256:e61c1b79c7ed77a0c19c775dc2631aeb456abd7ee262df9f2433f686f4c3b43d
Deleted: sha256:0dec01143ead7ac8e1b806b21cf9401eb76a707858a860aadd681c723b8b903b
Deleted: sha256:f6b86d1c606739d0d7cb889e5abff5e7ec29ac7a6d4e0f549e6c0b728b72d5c1
Deleted: sha256:b328578e5db710cae450b0857f7db95fcf47e5ef3b6495b438520e604c3a905d
Deleted: sha256:7f7efb7719a501391ba50351c383774b2d56f8afb822e4a8a6a184c708f605c0
Deleted: sha256:aa42c77f60e6ed22381c6cb81431577c366fbac43eaf7d323165bd2f046c88bb
Deleted: sha256:456276da7d158343704fa6d041b5351d05bf2b9b32eeef28349c8d01d071fb91
Deleted: sha256:f2e430ab53ddc98030950962364baf586163054cc4042644abafa8e2b4e96c18
Deleted: sha256:3d4fa7f15898275aa5dca6c0bbc61e5bf38164f7949590879908464e4942b11c
Deleted: sha256:96e4c17add6e2cfc762bed0507ee68bf331c7bf02c8d4730d1df07b196472a84
Deleted: sha256:2232a4c95859bf0b8144984324f1f664152730544de0c1143a24fb5bb3e1a5c7
Deleted: sha256:5e08e6cfface0427a1a541cc383b567ff47bafe39e8dbbbad78b448c23eea80d
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:e0105589b3d0a126d38d45887e2a8a4a19495c4caa482c806f57deaa3d9ab248
  Associated tags:
 - 20190702-121148
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-121148
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-121148].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:e0105589b3d0a126d38d45887e2a8a4a19495c4caa482c806f57deaa3d9ab248].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3700

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3700/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7536] Fixed BQ dataset name in collecting Load Tests metrics

------------------------------------------
[...truncated 207.92 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_03_37_12-428570342762003146?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -131: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-109\x12\x04-107'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -131: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-109\x12\x04-107'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_03_45_10-8821825635959112573?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
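
The KeyError payload above becomes readable once the bytes are treated as protobuf wire format: 0x0A opens a length-4 field 1 holding "-109" and 0x12 opens a length-4 field 2 holding "-107", so the lookup in bundle_processor.process_bundle is failing on a two-id blob that is not present in its transform map. A minimal Python 3 sketch of that decoding (the generic field numbers are an assumption about the blob's layout, not a reference to the actual Beam proto definitions):

def decode_two_string_fields(raw):
    # Walk (tag, length, value) records of proto wire format; assumes every
    # field is length-delimited and shorter than 128 bytes.
    fields = []
    i = 0
    while i < len(raw):
        tag = raw[i]                        # e.g. 0x0A -> field 1, wire type 2
        length = raw[i + 1]                 # single-byte varint length
        value = raw[i + 2:i + 2 + length]
        fields.append((tag >> 3, value.decode('ascii')))
        i += 2 + length
    return fields

print(decode_two_string_fields(b'\n\x04-109\x12\x04-107'))
# prints [(1, '-109'), (2, '-107')]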

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 954.097s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-102931
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff50100c8711b54505ead1b50b035379d203da5966d8e2de29d41a4487a9f624
Deleted: sha256:d186cf5faae955a9a535bddf53a92867c83ff865cf4e6969dc1365fc04dc446c
Deleted: sha256:7f146be85e31967186478f5f26a62e4d451917713c05bd7cb84e469a06940bec
Deleted: sha256:6cc4340268b39c8b8f44739376afc14ca9394b2419f212f11d4e370d6452aaf5
Deleted: sha256:00c5a334d915123d311ecf506828944dde1fb1a8021fd8cde893cd5f4cebe1c1
Deleted: sha256:076937ed4777bd5579e991bdc8c8628e44b4a34570618f9fd199c984c7ebe564
Deleted: sha256:3581aa9fc578d7ea6811da4408d294cea68ad9918bf77c95e2a1f1edd7f87b19
Deleted: sha256:bff623e455f35808bf29053fb1f4e6bc64f2d34d63dfb3fb1304919c4663e383
Deleted: sha256:a4075a419aa7071da83423a48231baaf3484eacb5dbff259e951f2f57d72be21
Deleted: sha256:d92e786a3ba9ed33b988aa7e0ae3c0d9ca2b5f4690757f655f577be78111e200
Deleted: sha256:4b318d47369ef1fad6f87d98786b8549fddc9b353ba4a779b9ab9f3e90e0667b
Deleted: sha256:6222f036baafd98330f291729fcce3f960d399d6d7094837fbb1c3e8e15b1b92
Deleted: sha256:6ef73b180f415ddb4bd4b4a1635f1a03e1454e8471ca7a52fe7d2b0eafea44f2
Deleted: sha256:3181e940bafc365fbdd761d5b8f10d181fa50d4660a344aaa8cdcf93711cc27c
Deleted: sha256:0a19cdad36b485b17cc889484ac934eb12b27560a3aad20d6117fdafdcdc0510
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff50100c8711b54505ead1b50b035379d203da5966d8e2de29d41a4487a9f624
  Associated tags:
 - 20190702-102931
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-102931
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-102931].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff50100c8711b54505ead1b50b035379d203da5966d8e2de29d41a4487a9f624].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3699

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3699/display/redirect>

------------------------------------------
[...truncated 208.35 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_23_09_00-6522216733531613032?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_23_15_38-16798764953135447462?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.003s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-060023
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:be6df43081201e155e10922e14996eaa9d3a8c331c164197bdc41113290c3af4
Deleted: sha256:65d8980123311d1140b92c1a0d0e5691b86fd2fe67f687d3faa442726172e9b2
Deleted: sha256:c696ac47d42e502727f269c1dea8c5bb9893bc59bb58539b00424c75a0866658
Deleted: sha256:7594ff2a57755dfdce8289122874f0183e49fcce938d4e49ea2127b16c0ea782
Deleted: sha256:6eead157217d01d39eb6cdfbdbf7c99385939922d8fca429c4e5e9c0071ab5e7
Deleted: sha256:3717d2f0c8798abf6c081065d528986fbbcb2059aa66601f1375bc113b5c9a4b
Deleted: sha256:6ab4191601820c7da6214672e3abe90712503bc860b7b3a7c1742714d55905ea
Deleted: sha256:77950f714de0b2b2995a4ce0bee10628974cf7bba6f160c34a8c7342dcabc604
Deleted: sha256:f52f90b09c499c82fcaffdd94fbdaea1c87ad1fc0d713f6ebd4d353396b87dff
Deleted: sha256:1f9dadf182b3b8be7a080acba2f90ba998abe040d068def2bf7e4fe1d5d388a4
Deleted: sha256:a91c1868fa4433ced86f7cccb095747db3a80c8d6e2e9f88a8a16476cfcb53fc
Deleted: sha256:e5186993fa8f85e4866f1df7abd7316fe19f30c8fb86d056714b72d59ecee34b
Deleted: sha256:7b65b8fdff3b7f4cbbacecde46e3bdb26adea247a44df313239fbdff8f16b06a
Deleted: sha256:e0c191248619fadb30ddbd39f579084f1cdcd72cbea8ed805a95a4ccdabe2928
Deleted: sha256:4bdc5b374f203bc6ab48a9c0afb3c99dd800a0f238167cae01daa0fe5472d85f
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:be6df43081201e155e10922e14996eaa9d3a8c331c164197bdc41113290c3af4
  Associated tags:
 - 20190702-060023
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-060023
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-060023].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:be6df43081201e155e10922e14996eaa9d3a8c331c164197bdc41113290c3af4].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3698

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3698/display/redirect>

------------------------------------------
[...truncated 207.41 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_17_33_36-2246884866252425506?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_17_39_54-10830800617907297680?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 709.761s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-002544
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2be9dd9910ee246a85489079e5c76395743decc7b4eb4dc02816b9f0841c9ae4
Deleted: sha256:a9a48cbf669ab0013c5f0aa46b08c3a2f4a80b14f63591136383a36d3b3a329a
Deleted: sha256:4c23b3c4ce057ca76582104f7159910e8cdbb9d06883c03117e64a83c6950ec1
Deleted: sha256:73329a543cc25a19b69b7c01c84c6c7c35db5ced1b7d7e2950abbd94dc9355b1
Deleted: sha256:8274180a92a2264a7290ccc14edb16a3995b89ec0de0e6259c99b4c28e9cc314
Deleted: sha256:a74f4c4681a9d61f82127e8b91628c3f74db15c2234d0adde3045bb97c4ae4ff
Deleted: sha256:63581655b0d9f33a23a8951e2f02bdcd23bacb2ee7d5bb96800c787fd933fcaa
Deleted: sha256:0a74f149a39dab849e5478ff8878992532ced4db93207f22476a3d499dae6482
Deleted: sha256:28784da26d9a3489b2e94485aea4f4508d3dc1a3d494adef9fe5d1eca32bf13a
Deleted: sha256:fb20f5c340e16a7b0d8b711b650d67661bbe0821b34b797a5580570cf670bab4
Deleted: sha256:c7013dc3e8d7724d777b0fbfca2a83100a8376cf7a4abec4539f259af96e68ec
Deleted: sha256:530e4970c248c361026e46112dafb7713a0f775ddb57cfdf64c87f5676f72111
Deleted: sha256:bc7e0c6853ae678823c8b73ba129ff644ebb52770a984cdec2beb774a3a47bc8
Deleted: sha256:a768204c0e83531077f6c073923bd2a6c454df5bb9590164380afb431f5ecd53
Deleted: sha256:42eab8683067e7a9a8c427a5f355c777dffbefdb2051937ec671fb0134b1b4b6
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2be9dd9910ee246a85489079e5c76395743decc7b4eb4dc02816b9f0841c9ae4
  Associated tags:
 - 20190702-002544
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-002544
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-002544].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2be9dd9910ee246a85489079e5c76395743decc7b4eb4dc02816b9f0841c9ae4].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3697

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3697/display/redirect?page=changes>

Changes:

[github] Tiny typo fix

------------------------------------------
[...truncated 207.80 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_16_58_24-13100792213788527829?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
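
The KeyError value in the traceback above, u'\n\x04-290\x12\x04-288', is not a plain transform id: the bytes parse as two length-delimited protobuf fields carrying the ids "-290" and "-288", the same numbering style as the instruction ids elsewhere in this log. The small helper below only illustrates that decoding; it is not part of Beam or of this job, and its single-byte tag/length assumption holds only for short keys like this one.

    # Illustrative only: decode the KeyError key above as protobuf wire-format fields.
    def decode_fields(raw):
        """Yield (field_number, payload) for length-delimited protobuf fields."""
        i = 0
        while i < len(raw):
            tag = ord(raw[i:i + 1])          # 0x0a -> field 1, 0x12 -> field 2
            field_number, wire_type = tag >> 3, tag & 0x7
            assert wire_type == 2, 'expected a length-delimited field'
            length = ord(raw[i + 1:i + 2])   # a single-byte length suffices here
            yield field_number, raw[i + 2:i + 2 + length]
            i += 2 + length

    print(list(decode_fields(b'\n\x04-290\x12\x04-288')))
    # -> [(1, '-290'), (2, '-288')] on Python 2; [(1, b'-290'), (2, b'-288')] on Python 3

One hedged reading, not stated anywhere in this log, is that the runner worker and the SDK harness disagree on the data-channel key format (a serialized two-field reference versus a plain id), which would make the dictionary lookup in bundle_processor.py fail exactly like this.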

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_17_05_57-2180223769591991146?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 843.787s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-235102
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:0af072e749caf16306aeb4c1568ac6ed23c5b98468be70d2cfd67cebc201eae9
Deleted: sha256:0d45a988f071f0d8306d6195db7d9833e0090091781d1bb9c916897e6cebd6dc
Deleted: sha256:a28b649a41040a56539031b685f38405a3b7866837fea1e089143bd6442f2aa5
Deleted: sha256:bb4fa1bf81e800a093a36ccfee220aba58229404dab85dd753ae2468d82cb31b
Deleted: sha256:25724a22d666f8cc2fdc951fb1ce28e5e58743f01cb500c2bcdc8a115c4b2e49
Deleted: sha256:95d28f0b444f5a41d2986a1152211427245dfc417cc1dd8c28ed82ad27c96de9
Deleted: sha256:40fdce96099fff9de59ae5289b34fc3227f4efe52c75a64e946c74038e79eb88
Deleted: sha256:f19927d98f92ac6c06f25da9504f3251d12867c3dd99c4f57a400caaefc8a9c1
Deleted: sha256:fa2c3f3254450d2a5482f307d7ea5412e9ae542a05138fe8e1f6611248277bd2
Deleted: sha256:bf0480ad1e2ce32f8737d80c731d5dbc02938e513140694432f711cf69e3ed91
Deleted: sha256:995cbeb58dd2c89d6d31c47e1903bfaf24e49af8e2e961035d4c09bdd06f23cf
Deleted: sha256:3594402fc4e848d23a222d12a02992944b79c63170b8d7e0362fa9394ceaadf5
Deleted: sha256:77a106bafb6a9b28852301490bb2835fd7ce53a7c730d4f8f5f03556027c9485
Deleted: sha256:11fe9f27d6777d812f193244993c0a42a34963fff32ac4b8ec60c6fec1bca98f
Deleted: sha256:9767e360b60c73e18caef1caa379b8e9ee984f813093039e75c9ed5cbc0b40df
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:0af072e749caf16306aeb4c1568ac6ed23c5b98468be70d2cfd67cebc201eae9
  Associated tags:
 - 20190701-235102
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-235102
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-235102].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:0af072e749caf16306aeb4c1568ac6ed23c5b98468be70d2cfd67cebc201eae9].
Removed the container
Build step 'Execute shell' marked build as failure
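
The cleanup_container output above (the Untagged:/Deleted: lines followed by the Digests: listing and the Deleted [...] lines) matches what removing the local image and then deleting it from the registry would print. The commands below are a hedged reconstruction rather than a copy of the job's actual cleanup script; the image name and tag are the ones shown in this log.

    # Hedged reconstruction of the cleanup step (not the job's actual script).
    CONTAINER=us.gcr.io/apache-beam-testing/jenkins/python
    TAG=20190701-235102   # tag as it appears in the log above

    # Removing the local image prints the "Untagged:" / "Deleted: sha256:..." lines.
    docker rmi "${CONTAINER}:${TAG}"

    # Deleting the image from GCR prints the "Digests:", "Associated tags:" and
    # "Deleted [...]" lines.
    gcloud container images delete "${CONTAINER}:${TAG}" --force-delete-tags --quiet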

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3696

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3696/display/redirect?page=changes>

Changes:

[hannahjiang] BEAM-3645 add ParallelBundleProcessor

[hannahjiang] BEAM-3645 reflect comments

[hannahjiang] BEAM-3645 add changes from review comments

[hannahjiang] BEAM-3645 add thread lock when generating process_bundle_id

------------------------------------------
[...truncated 208.64 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
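
Regarding the Datastore deprecation warning above, a minimal sketch of reading via the suggested apache_beam.io.gcp.datastore.v1new.datastoreio module is shown below. The project id and kind are placeholders, and the class names are recalled from the v1new API rather than quoted from this log, so treat the snippet as illustrative.

    # Illustrative migration target for the deprecated Datastore client warning above.
    import apache_beam as beam
    from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
    from apache_beam.io.gcp.datastore.v1new.types import Query

    with beam.Pipeline() as p:
      entity_count = (
          p
          | 'ReadEntities' >> ReadFromDatastore(
              Query(kind='ExampleKind', project='example-project'))  # placeholders
          | 'CountEntities' >> beam.combiners.Count.Globally())
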
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_12_47_27-7352719071507165178?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_12_54_30-7919707964121907924?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 829.663s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-193823
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff8dd37e5990e4d0368c0b07865bfa1b7049b0cbd7274b04a62f4597c90670ac
Deleted: sha256:ce609ca359b2f003d9b5167ed26d2b5e0143b51022e002568c74efa419ecc73d
Deleted: sha256:ae6e1321ad665aa9bdfcca3e505a9b54f7ad13835619f3b27ae9230ec890ec7b
Deleted: sha256:d52a3fdece78c33de83d84546fc41a8d81db253babd8e7552456d840ae8d29d3
Deleted: sha256:b56cf30e923ecb9b176d0bac6dacbc830da8d9b1c937288a673724f0c7584edc
Deleted: sha256:d7f308ac735f7a8db43ba3d27cb741834ec957a6f428c3faabf303dc614b98fe
Deleted: sha256:6654175892c5b1cc8268d4fa161817e1aef60529f2129c3c5418cafe99338067
Deleted: sha256:ada89d9284487f553f31c099239b431e9b0603cb62c8e02e5b485477894c22f5
Deleted: sha256:6860ab1420d30a742e0a1f6ffdc58e75f4fbfa505daae6babd8674ce051f5047
Deleted: sha256:dc99654ac4315930547c24c2bb744225aeb8462a9c61559fe98e4a7e42d9e29e
Deleted: sha256:29acd9fc8dfb7ca3089f0f6b56cb26e3124abfd6cd958481834452d100f9421f
Deleted: sha256:12be22230d37b0a5a0e025c4919f7924d2e4d3e532d8ee327d6d35e9af56dac5
Deleted: sha256:b5256b0f64dfb01774f77d2f2766cab0e6b426fa06b70cd1187031ae5dae62ef
Deleted: sha256:b8de197027227a14cf95f5a62a15a0ed0606ed8edb4518f99246517438497743
Deleted: sha256:59386f13982e04441c570f223765742bf204cd72aebbbba66f6f186a0ce0ecf5
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff8dd37e5990e4d0368c0b07865bfa1b7049b0cbd7274b04a62f4597c90670ac
  Associated tags:
 - 20190701-193823
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-193823
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-193823].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff8dd37e5990e4d0368c0b07865bfa1b7049b0cbd7274b04a62f4597c90670ac].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3695

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3695/display/redirect>

------------------------------------------
[...truncated 207.81 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -446: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-424\x12\x04-422'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -446: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-424\x12\x04-422'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_11_44_30-17571315390522134697?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_11_50_53-7281847916252296748?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 889.370s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-183410
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ec6c7ce1777c4715899e3c832d7f2633fe55733d866d6f5e91d25af2adaaacd2
Deleted: sha256:58a00ee2d847355a823dd25005cc59b6bfc43aa8a451375d978789d25ea6a4c3
Deleted: sha256:0f3aa4826b0f3263f3c004b7b689f995f72bf527c999f3c1daf09c77a2927c0f
Deleted: sha256:99a78137d0cc1c46ed427e64adb635459447595b6036e65642feb3a7493d31ae
Deleted: sha256:906494a3329283cdf489e39ec2b4c5c2cd9990f86ca02fff332ea4a2065ac434
Deleted: sha256:d6bcbdec1b885c5dee210ad6eb96751c0355e5f5d1e16016da1f24f3c7b30831
Deleted: sha256:9d34a4a4ece4320e9ee46a51734d4145dd42bc7378b60d1c4a0f434ce90ed8af
Deleted: sha256:2d258aed96e6cc57356b5ebb970674de82695073b80b8aa43bbc3de07270809e
Deleted: sha256:aa0edd4b9688acdcede2aa22fce6a8d245fde810dcab4709d2c5bf14e05caa29
Deleted: sha256:7d4756ab17ec614e7bc485b5f2b4cad2e3fd37931394ddf5b3178f594e14cb83
Deleted: sha256:e22ce3d2275639b62e6af70908bfa715fad82240fb0d7ad3b49358523a590a7b
Deleted: sha256:4c57d15dc3b5f2d6fd26dc2071750d0e27e69c4bc8ce59827796686fec9fcdfe
Deleted: sha256:d5a933b024e7426b022fb33617254c59268f1e299bb942e6a19f2cd28b73d323
Deleted: sha256:b585c13aa453172273bcc741030e3dfbe4b662a0ef5ba1f2c6c0ed6a470e1800
Deleted: sha256:efc91dd9fb4ff6620246e4c512b80d13955742af3a370509759b13daa7c26129
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ec6c7ce1777c4715899e3c832d7f2633fe55733d866d6f5e91d25af2adaaacd2
  Associated tags:
 - 20190701-183410
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-183410
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-183410].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ec6c7ce1777c4715899e3c832d7f2633fe55733d866d6f5e91d25af2adaaacd2].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3694

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3694/display/redirect?page=changes>

Changes:

[alireza4263] [BEAM-7545] Adding RowCount to TextTable.

------------------------------------------
[...truncated 208.00 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -317: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-259\x12\x04-257'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -317: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-259\x12\x04-257'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_09_55_40-8587369215981780305?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
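The KeyError value in the traceback above appears to be raw protobuf wire format rather than a readable transform name: two length-delimited string fields holding the ids "-259" and "-257". A minimal way to confirm that reading, assuming protoc is installed and a bash-style printf that understands \xHH escapes:

printf '\n\x04-259\x12\x04-257' | protoc --decode_raw
# expected output (field number: value):
# 1: "-259"
# 2: "-257"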

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_10_02_19-15799174409797617061?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 799.438s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36
Deleted: sha256:7dc3a5bdf99ed6265b3bad4a618d12a627a86c670a2ec0788d7ad5ca8122fd89
Deleted: sha256:117c3a550336be147737d97333547d71d69645654569f527d94894686558a07d
Deleted: sha256:9f07a2b86b60f53f776d54e0db63f8823481c816c57930aaefa866b41bbb838e
Deleted: sha256:e1906d2c523d98ad056a79a440f70fe4c32fd7f01fb8afb65f0c580e2cdb7995
Deleted: sha256:4bf81b465ea38cbe85551adb4319242987ffca8213e8e32a8aeef0363613f1c2
Deleted: sha256:0f846e210b4b40ce00c19c30a202976c1577169b90b5798ab5af55ba15f186e0
Deleted: sha256:d90e70aed17544ddd4537d3523bb965b6cd45030e66b00f4195b622fc980db68
Deleted: sha256:3db7bad8884ec31c3bf44887c8810392aae92ae6cd6014600b6306e81ec074e4
Deleted: sha256:01d1a8ce09f21d15bda98986f5dfc3df2b63009a912d033730388d11b63b11ab
Deleted: sha256:2e0baedee641bfe2c551078e530aea9fc4d3f7f6ba0a7cdd3af8823567aee392
Deleted: sha256:01def450d5642aaff856c2fe2ef73f45c901350478c36830dea00f6cca5b8fac
Deleted: sha256:aaa39830a4d147cd29622512c0a734b6732d7208ad6f9585dcd3dec355961eb6
Deleted: sha256:919d409554ab6cda4ec9ac390b627a5f902b179611670a547b1aed61e316eae2
Deleted: sha256:ab9fdb5142a35ce6bcd4ccb629f30e7c8630415a0dba27aaaf5bf6863bbe70bd
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36
  Associated tags:
 - 20190701-164733
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36].
Removed the container
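The Untagged:/Deleted: lines and the Digests: block above are the kind of output produced by removing the image locally and then deleting it from the registry. A rough sketch of what cleanup_container likely runs, using the tag and digest from this build (the actual script may differ):

# remove the local image (produces the Untagged:/Deleted: lines)
docker rmi us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733
# delete the pushed image from GCR (produces the Digests:/Deleted [...] lines)
gcloud container images delete --quiet --force-delete-tags \
  us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36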
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3693

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3693/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7640] Create amazon-web-services2 module and AwsOptions

------------------------------------------
[...truncated 207.92 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
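The variables referenced in the command above ($PROJECT, $CONTAINER, $TAG, $GCS_LOCATION, $SDK_LOCATION) are set earlier in the job output, which is truncated here. To reproduce the run outside Jenkins you would define them yourself first; the values below are illustrative placeholders, not the ones used by this build:

export PROJECT=my-gcp-project                           # passed as --project
export CONTAINER=us.gcr.io/my-gcp-project/beam/python   # worker harness image repository
export TAG=20190701-000000                              # tag of the image built and pushed earlier in the job
export GCS_LOCATION=gs://my-temp-bucket/validatescontainer
export SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)   # the sdist produced by the packaging step above
# then run the `python setup.py nosetests --attr ValidatesContainer ...`
# command exactly as echoed above.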
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_05_58_42-169823318733672031?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_06_04_50-9698400765863936851?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 750.189s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-125044
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4eab22c51501a567bd918962e017750a9de2f601d2fe05b38edd68d237600c18
Deleted: sha256:7dea568ac202b30d4d0f93c2d0e875ff4d7a8d328b982c3ab7fe01e02635ef1b
Deleted: sha256:e28a2d189728d503bc2e8e60137a8e0188f731c70ab9ef91fa8ad657c480fe6e
Deleted: sha256:c04aeb11f826fa941145ce8153a90975a862b6ea61d97a852d707bfadf847fd9
Deleted: sha256:2619165a41349f0f8b8d1e7e2da55cdf44323532f553e68e76913e29c7d52059
Deleted: sha256:cc533d38b0e75236e9525fb804c0921cabcc30910ac51c9011d97a0937699f63
Deleted: sha256:6e9a16e8bc450cb5dfd3abd87a381fb2b79aa16a77954df9344e79c514947697
Deleted: sha256:a78c63c088ebdcd30c2c9f99a0744bd6f3bcfe7d082a85c84cd4ab8a9c651f38
Deleted: sha256:fbde2cf2a721f2651506f16b6aee359d34e57ec550d79625d5eebe18a6878276
Deleted: sha256:8d703206d34ad8dc5fa224b655f7c9c624c018fbff0bdc268c9888a9066bb3cf
Deleted: sha256:2212fa38974116fe5444dfa94518ead15ca5db603d21170c85f4ffdd38d3bb69
Deleted: sha256:fbd649927d7f00f3abf4266d05a29a296cc2ceaa10789636fe5256ee72a52533
Deleted: sha256:2e153d5d5f9676c4df23a469971dc0534b2e312caa6166b3a6960a18def0ff5c
Deleted: sha256:5b5648439e4ab87b16de2d4931bcc6c65004b354548c6954c59beb9de807cf0c
Deleted: sha256:1c506145dddd529b9c65a74cdca0a46b768f5c51f1f8faf6acb15033149fb529
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4eab22c51501a567bd918962e017750a9de2f601d2fe05b38edd68d237600c18
  Associated tags:
 - 20190701-125044
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-125044
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-125044].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4eab22c51501a567bd918962e017750a9de2f601d2fe05b38edd68d237600c18].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3692

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3692/display/redirect>

------------------------------------------
[...truncated 208.25 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_05_07_56-13902975674994164603?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_05_14_34-1913315805920410296?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 738.637s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-120010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b6f6203e679c1cce3436be3cce200c074d7da244f75f75fddf4a6bcc4303d6d5
Deleted: sha256:79898cfb79cffb95443b3a2969d1622f9269139da61ff390737b61450b00c7d7
Deleted: sha256:01322913b6155222fe4ef771c9348f78d5c7f600cc0cb66f06a471f82acfa0e5
Deleted: sha256:07513602fe31eae0f2d491b815ec72802a9451108125739ba79da12c28d66c77
Deleted: sha256:3958fd62f23278f8e442bddadf3f2c6b2badddd64e7815cbc87e1dc173ffcbd3
Deleted: sha256:fd7876993a8587d2dd8d363617e0253fa056c27be33034b0803e2b889bdde770
Deleted: sha256:2fcc3d8a3b1ef8c0446b67f4b4f92743b69868eef544151b77e25ad8b9f0ccbd
Deleted: sha256:e039314bd5835a42c1a6aea01d4339ca5309a53461d5551a6813f1aa9b5ea7f0
Deleted: sha256:851515b6f67ff69996ad66c007019c37a2b6b66db41ffb67c8c27b0fc8d0d2c9
Deleted: sha256:12b0ca85889e17951afbac078754681da6a894ff6959428221338b6fe4b831a6
Deleted: sha256:77ecd26fb56b329f975fc8e83e7a7ba479c359bfb6e491e9be806b251a859ec3
Deleted: sha256:b820b70b6528d79d3c2cb494630ee45e39c6afd7cb8236a41a32df70e3f3abe8
Deleted: sha256:899fcdbff0d432c61559c40a85afbad7cffb3e367d0d2c6fe80e5c71044c73a1
Deleted: sha256:afb8486f5c28cde0fefc1883e924e3baf09d040ee0a16a3ae53b2d42c960b179
Deleted: sha256:bd444eefcc4ca89bcef6838740aa4628d670d39a7b1f105f34c63b7c420f80a7
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b6f6203e679c1cce3436be3cce200c074d7da244f75f75fddf4a6bcc4303d6d5
  Associated tags:
 - 20190701-120010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-120010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-120010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b6f6203e679c1cce3436be3cce200c074d7da244f75f75fddf4a6bcc4303d6d5].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3691/display/redirect>

------------------------------------------
[...truncated 208.24 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -410: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-352\x12\x04-350'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -410: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-352\x12\x04-350'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_23_08_37-6069273861588078448?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_23_15_50-12506290409114216380?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 804.924s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-060017
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4cb97e29a64290f29cc142952358a1dbda898e7b4d103fdb8d721829b8eefe36
Deleted: sha256:cc283dc49f8cbfe108449f5b7145317e40875b6e2c6728dc5fef00bc7089a5fb
Deleted: sha256:7dff4bceee8b700924d7038d72a084573acf0c083cd6e4c411e28024753adc39
Deleted: sha256:a29fa8eb86f790fd31c20fc61696cc507596e22a88845b83cea278d87d9ea47b
Deleted: sha256:dafe22c3f7edeb829a10899a495bd46fe386d68b6da3122b8eceba53deaad01d
Deleted: sha256:285db745fa45d8306958b8073186e8d1ab6703b8574d4a83b02ca3b59a98b3f5
Deleted: sha256:29b7c58dfc1decfa5278f03d2a2230a19f68ce65a86ab963dcb77b2f59713081
Deleted: sha256:970828864eed07d155dc69558cad7ff540a4af1f86fa31414067d531cb66107f
Deleted: sha256:213fd57f13cef5ff49b90252155771757ea0d5d8fa219284fc6efbf6ddcd71e8
Deleted: sha256:721eeded964e5c4bdd87930a68b675482de51cb1e10fd34ec4f9dc972f8ef00d
Deleted: sha256:4683cb06610cd7971a561d45e27fe4f5216184f99dc0080e68c799b1cf9dca7c
Deleted: sha256:9e68ea3b292e2544211c4c54a671b0e20a17055d8ef732dbdb77188431524758
Deleted: sha256:1a061c85ce09ea58d2caa2868e8f7900d55a27c20a590093567aca86aadd8ac3
Deleted: sha256:d5b3d3084815e95cdc72b200ec5c81aa450e60781cd7cd39e2d6d6c06c89427a
Deleted: sha256:bb7cefb236dd8bee10d300ec691842cc321d61fd361d16f6617fa57b6f9f6423
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4cb97e29a64290f29cc142952358a1dbda898e7b4d103fdb8d721829b8eefe36
  Associated tags:
 - 20190701-060017
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-060017
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-060017].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4cb97e29a64290f29cc142952358a1dbda898e7b4d103fdb8d721829b8eefe36].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3690

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3690/display/redirect>

------------------------------------------
[...truncated 208.20 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_17_08_40-5841956548398661571?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_17_15_18-9595244814895190416?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 834.855s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-000011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:49b2308e16160caffc3a28844c416bcf4c15956f7d62fe6e2ab51ee01d96a099
Deleted: sha256:8d280d7b7add44d2ef30497b36cf2fa2fff994dfafbe5e8ec8219a107b9367fa
Deleted: sha256:cb725cf14df23395fbe76ca5c2b11f36aace779bee2a0175aeefaa055039331b
Deleted: sha256:42452f4a77b4f5cf1e99ebc515d59dd0894cd0701de4fd462c47de1db79938c7
Deleted: sha256:7bafa6927e6e1c0f19d1a76230c63357f4b8c02bfea9b2c21337416bb9f736de
Deleted: sha256:a2568e9505cd398d11fd2eefb0dc8e47376b198115d636f2dbf6027ba5dcd4d3
Deleted: sha256:eb65768305fc58bcff9ad53a6268dce30ddc3ee7556c485ee9f5542ca8d91fb5
Deleted: sha256:d73d873432e63818b7a5d73caa6ad20402b2a39c89ad02edcf5771b06a0c7244
Deleted: sha256:584d02c9b4159e7cb33c6af2c4235622a1027fecf8656f2d0a4009218886d54b
Deleted: sha256:ebe490b933ee76d0b730a58a8a8f6b079440c56648c2ed6058b02378f025fd7f
Deleted: sha256:deb85116ef90b17cbae518ede474e398aea13eaea1c55bac732534bb0b500561
Deleted: sha256:28eecb78824dec7e485c349d23ac82381cb16372d2a71802e7817d9bd2e32621
Deleted: sha256:ddd0019b44082950ab776340e30cbdedbe0c7ba60bd6e9666ebb338260ca49e2
Deleted: sha256:3ed29f0290107e00643f393121816f99cb78591ad6dd1811b0b0119614fa83f4
Deleted: sha256:857cf3e906bf9cdf5dc9860d861e9aad6fc9fa31b4ad80279326f1dd45697b54
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:49b2308e16160caffc3a28844c416bcf4c15956f7d62fe6e2ab51ee01d96a099
  Associated tags:
 - 20190701-000011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-000011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-000011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:49b2308e16160caffc3a28844c416bcf4c15956f7d62fe6e2ab51ee01d96a099].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3689

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3689/display/redirect>

------------------------------------------
[...truncated 208.62 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_11_08_29-5253770123226336625?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_11_15_47-2551620475198357571?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 794.597s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-180009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2ef195c487ff4f93760706ef9cf8cc8840fc253fe7f32970c5f9d261d09867ad
Deleted: sha256:2234642c8b0ebc79d04d0a67fce68fd209fb8e8522668d0c85ca7b75a501ecc2
Deleted: sha256:421c4488c4f7fa3509fe9cde52a87dd8c3884151015bb384b04a0ea327455292
Deleted: sha256:0500e5f912c89dc27f70a8d6b9e03fbad8805f76cef2ad9046ad3ab677aef17b
Deleted: sha256:da435bf1bdde69e3fbb7762bd6ebda71bae8a5c7d8c2d0471396fe991be454ba
Deleted: sha256:66110d6ed9fb752f5cdb1233016342452ade9e2bb6f5e68dc599a602534e044e
Deleted: sha256:10a45dde525132ff02cf54790395ac73570d0deba359f2e982ebfdf0c82dddfd
Deleted: sha256:1ddb871996ca8f4a8cea6d582d8b91961479ac64456eb0050036b8b64128c646
Deleted: sha256:751b79b410104222339f1831d0c5c354b87ac495b3f9d33352b20ca4afe65b92
Deleted: sha256:edc8ff7d38802d1743ccce08626763ca113943c7d8ec676451f3e7c1047fa2f8
Deleted: sha256:040057ffaef94275b25b7cbf5e9c16d7eccdb881eb4f7732d644cfe5ed885d29
Deleted: sha256:0d6d07fcd6e7bd61cf345417368858f2accf455eea10e2ad788cd686f252c2bf
Deleted: sha256:e56c2b9bab469f0a6501ae47606b6e142e8672cc684f0505f4f153dcfa51e9a7
Deleted: sha256:abb2cef2260209c42e8f5c2c858594492870d8ef029f09dba0212bc00a78123d
Deleted: sha256:ada478ee3cfb5d90a081b778eaa49f511d7ab0ba9d6aa71f4414d809b55648c2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2ef195c487ff4f93760706ef9cf8cc8840fc253fe7f32970c5f9d261d09867ad
  Associated tags:
 - 20190630-180009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-180009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-180009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2ef195c487ff4f93760706ef9cf8cc8840fc253fe7f32970c5f9d261d09867ad].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3688

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3688/display/redirect>

------------------------------------------
[...truncated 208.59 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
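
The Datastore deprecation warning above points at apache_beam.io.gcp.datastore.v1new.datastoreio. A minimal migration sketch under the assumption that the v1new Query/ReadFromDatastore API is used as below; the kind and project values are placeholders.

from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

def read_entities(pipeline, project):
    # Read all entities of a hypothetical kind with the new client-based IO.
    query = Query(kind='MyKind', project=project)
    return pipeline | 'ReadFromDatastore' >> ReadFromDatastore(query)
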
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -349: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-291\x12\x04-289'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -349: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-291\x12\x04-289'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_05_08_29-17554109334196484419?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
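
The KeyError values above (e.g. u'\n\x04-291\x12\x04-289') look like length-delimited protobuf encodings of two short ids from the pipeline graph; that reading is an assumption on my part, not something the log states. A minimal decoding sketch under that assumption, which only handles single-byte lengths (enough for keys of this shape):

def decode_two_string_fields(raw):
    # Interpret raw as a protobuf message made of length-delimited fields:
    # tag byte (field number << 3 | wire type 2), length byte, then the bytes.
    data = raw.encode('latin-1') if isinstance(raw, str) else raw
    fields = {}
    i = 0
    while i < len(data):
        tag = data[i]
        field_number, wire_type = tag >> 3, tag & 0x07
        assert wire_type == 2, 'expected a length-delimited field'
        length = data[i + 1]
        fields[field_number] = data[i + 2:i + 2 + length].decode('ascii')
        i += 2 + length
    return fields

print(decode_two_string_fields(u'\n\x04-291\x12\x04-289'))
# -> {1: '-291', 2: '-289'}
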

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_05_15_32-13033466020270711883?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 793.842s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-120008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8f9518f5bb50d528c19d7913d246217eded7ec1bbb65744947fe9afb56aff9d
Deleted: sha256:1b163bbe89884df7073d09945091a766bcf89388741ce97ca27fcbab45e1e9ed
Deleted: sha256:0c087c07c59ec8744d7f5618da21a8d0747e69cfae4a5111cf23efff2bc05c9f
Deleted: sha256:406fe45146c32019cb6d06834a3bb1b3d6183c0d12a437bd2004d70616622b73
Deleted: sha256:7060d0eb1a39571473025cfb32b3d2e35fe1340ae91bbfb4b92b74aa821a86ee
Deleted: sha256:a6f31e2b83be5fdee6f226feea3646f45c0a0e571e47130136fd92b93cfdd736
Deleted: sha256:613919269b4aa7bf4dcd96eded77d097e1c59541555b455ffd664f1dc1a1c9c1
Deleted: sha256:519116c01671b83f7b2dba28271471cc4e51a01074d647aeb24f74b59c92e5e3
Deleted: sha256:ee75ff856e5d7718f1c2cfcd05eecf55bf3e8a84f463a5d10f65c5e46c7cb8ff
Deleted: sha256:8b7bf44bf2f8dae26a624833cc6bf6989d0b45ea25bfa7afdb2327ec90fba416
Deleted: sha256:9af6b9bcc37ef10be5ee12203c92409a0fd14e33df6abe1076175d5dedc7069c
Deleted: sha256:77df6ec1cec8524a042ebada7c0125c133f482dce156f49cb4f0939e9819ead1
Deleted: sha256:807bd697f45914709eb43c37998232146bd8fbfab1826dcfb59cd43235c4fd5a
Deleted: sha256:37bd919f30d0cad77057f39f0019c3f9515fc0eb1ee8f4b83eaddb92a87e80a3
Deleted: sha256:287ee5eca295198539fa98161f9f979d6116ea20c2daca448c294f12a4135706
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8f9518f5bb50d528c19d7913d246217eded7ec1bbb65744947fe9afb56aff9d
  Associated tags:
 - 20190630-120008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-120008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-120008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8f9518f5bb50d528c19d7913d246217eded7ec1bbb65744947fe9afb56aff9d].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3687/display/redirect>

------------------------------------------
[...truncated 208.36 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_23_08_29-2002597637765519090?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_23_15_12-3517421203121459086?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 830.199s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-060009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a3855f12309982a94a3854ec8faedeb047708b0a2481177be6409b3b205c627a
Deleted: sha256:2ef8c98b05d39ed703b6e1792ac94a19bdb48f01be1b2b32f24888daff118d5a
Deleted: sha256:19def2b7b5e159a3dbb341a037420809c9390cfc97ce7a42cee837efed8b77c7
Deleted: sha256:80a22d48e3c03d925960ef3f4fa127c203ae292535c6e87c17dff05c3c3b51d0
Deleted: sha256:8babd2047cf465150deb6240b4c29be9aedaf0099593d533ac7817d04b4a4902
Deleted: sha256:d8a3099abd1e8b36e3644929d64c3fd636faef44c708c185239ae28e3fc1ca75
Deleted: sha256:a7db2e25b7510a284c2c761a4e8a34d020509f9ef9193be08553b2b53c12c5c0
Deleted: sha256:9de632465b7f936ea9e48c6fedc0c12e9d1b0aa4eb2e3cd17b283fd8bbfcd562
Deleted: sha256:71452f721f262f28f9cc217312fe9c46b793c76cc1b1b0d4d78b6692e50f29c1
Deleted: sha256:8e240679e39cdfe5c97e5ad9be1e7fb33e29210c64d0a9259d77a598af198217
Deleted: sha256:fdf54abd1ce3860d591527fdb2dc1806cc55aea8bdeb59648ad8e2101c6e8d5e
Deleted: sha256:e713d4a9f800ed4c86b7299ac5e0f82ca323b19bdc8159e5311fc0360699c701
Deleted: sha256:b36c712ceb90bc381425941a73d071189ca3335161ff5d185231fa08dd4f0c99
Deleted: sha256:0554b88af58f139a964546a9b02b5d1c0771b963b41265b48862521271740319
Deleted: sha256:548162a9c9bf73778e824bebb8417766ff76353547bba6dfc340ff775bab8c05
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a3855f12309982a94a3854ec8faedeb047708b0a2481177be6409b3b205c627a
  Associated tags:
 - 20190630-060009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-060009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-060009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a3855f12309982a94a3854ec8faedeb047708b0a2481177be6409b3b205c627a].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3686/display/redirect>

------------------------------------------
[...truncated 207.93 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_17_08_43-2807750522873383878?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_17_15_01-731854156366602360?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 758.946s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-000015
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9d9599530fec7001b13bdb4789d13e34a2eea04ce306a0b434fe171fae7cd54
Deleted: sha256:6142546cece44837bdf5a92aeae1dfca8caa1f2a3a0506c9d99ffb3f7965cd92
Deleted: sha256:ba6eb2393fdaa442f84c606261148e50b1f87b728cee5fd6494afb2a177b2aa0
Deleted: sha256:d52388dece9e3db1e782aab7c2034dfef41505602694eb613c902cb8daa39c96
Deleted: sha256:92ac2ea9638fe0717a7bfe387a74577c90a94013b763c5093737f3c1db138e5c
Deleted: sha256:fefa1e5bc6d0a2715903594fcfb16678c6aab860b7194f69b080b6d01d263baa
Deleted: sha256:bf68c0a1121336086d1ec8527190c288078fc18673e6476087626d011c14a9cf
Deleted: sha256:008b50e31791ced785cb62b0aba24b1bdccae362330588fe2e922bc7cf49caa5
Deleted: sha256:11b32a67bf53d0c04fb74c7fdfe3cb59fc7e1805518fd63b5c800f55f43f26d0
Deleted: sha256:0834423f3cd6d8c5cbb0cd99d89d2756227ce395d0341ae1d1d2561d0f6436fa
Deleted: sha256:c6d4df0327a5bd7812e4ca5e75c782723b3fdf7a4df04672c1d9578618f99446
Deleted: sha256:347eac95c7f7f4cad564ea296a7b81f8161c2eda0d3be2b189677b2407ec3718
Deleted: sha256:7eb0c4adf83ca4b4bf30a0cd33ff21669c6343c9db466c2f070c64a5df06fdec
Deleted: sha256:5f449e888efc951ee99ebb5410f14819b5fa8df7d4376a4ad1652d41f7f35e10
Deleted: sha256:f4502c6e146dbae32e768de61ad7d12c2f3cdcbf43bd85115555a83a2d57f135
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9d9599530fec7001b13bdb4789d13e34a2eea04ce306a0b434fe171fae7cd54
  Associated tags:
 - 20190630-000015
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-000015
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-000015].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9d9599530fec7001b13bdb4789d13e34a2eea04ce306a0b434fe171fae7cd54].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3685

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3685/display/redirect>

------------------------------------------
[...truncated 208.51 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
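
The deprecation warning above names the replacement module. A minimal, hypothetical migration sketch, assuming the v1new module exposes ReadFromDatastore together with a Query type (kind and project below are made-up placeholders):

    # Hypothetical migration sketch for the deprecation warning above: read from
    # Datastore via the v1new module instead of the deprecated client.
    import apache_beam as beam
    from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
    from apache_beam.io.gcp.datastore.v1new.types import Query

    with beam.Pipeline() as p:
        entities = (p
                    | 'ReadEntities' >> ReadFromDatastore(
                        Query(kind='MyKind', project='my-gcp-project')))
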
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
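
nosetests collects only the two tests above because of the --attr ValidatesContainer filter in the command; as a hypothetical illustration (not Beam's actual test code), a test opts in through nose's attrib plugin:

    # Hypothetical illustration of how a test is tagged so that
    # "nosetests --attr ValidatesContainer" selects it (nose attrib plugin).
    import unittest
    from nose.plugins.attrib import attr

    class ExampleContainerIT(unittest.TestCase):
        @attr('ValidatesContainer')
        def test_runs_inside_custom_container(self):
            self.assertTrue(True)  # placeholder body; the real tests launch a Dataflow job
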

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_11_08_33-10210262735420284253?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_11_14_26-8911216390363898068?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 743.682s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-180008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:0c5e96675360c16d6d262670c8f2be36573be1de30abe9278f1d92e78a73caf1
Deleted: sha256:473dc312a7f204b71d2a26635bdb54587825330d00114f78505fb15bd70ec401
Deleted: sha256:29349ba62bf7b494130cf0845db0414c58affb3956c4e4d1771298b8b13e0bca
Deleted: sha256:e0c3ac03ec759e65783053e349b7d3569a0b6a9b83a689caeb7a0309bc514672
Deleted: sha256:cda7b90483cf30f140fccdaf3b4b7c2da096f039f0f1df16ba9315e39a3f73f0
Deleted: sha256:9f0a955b7b793eb024cf8a95656d433b77a07f8bbd4f6239404bb731a3b55f72
Deleted: sha256:0228b77f253c2dbd3fe0cd923360ff2dd220cce2109aa2ff5cfa3948d1f2f3d1
Deleted: sha256:cdfd09f8e42fe79f1392f5466120abd96e4cb14851d95cc63b164f5292abac2a
Deleted: sha256:0869643c6384341b8e2ae6efd4dcf9938c1e5753d2f1d0b819b7c2b0264ec68a
Deleted: sha256:9d4cec50aaaeda2d6fa4e635caea367e4ab7de18557f0a2cdee3c7a53809b6d5
Deleted: sha256:597dc544910cd44f766bd294377811b74dcb6dd1644715db85916905f0382b5e
Deleted: sha256:d0dd8f82d343e10944a50578c6e7e7d1cdb18f541e520ef51bf2e11c55dbbc84
Deleted: sha256:aa79d0499b5380729b754c9a1d6968740fdeefde849fac9900e5a8302d995555
Deleted: sha256:9a830b1d6eb8373487e362111aadd522dd5b396138b3d12ad2fe12a192d33cea
Deleted: sha256:7b303a5b55df83cebe5ef4a0361ab5bb9c24cf424f5dbcd669443b02133b823b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:0c5e96675360c16d6d262670c8f2be36573be1de30abe9278f1d92e78a73caf1
  Associated tags:
 - 20190629-180008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-180008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-180008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:0c5e96675360c16d6d262670c8f2be36573be1de30abe9278f1d92e78a73caf1].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3684

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3684/display/redirect>

------------------------------------------
[...truncated 208.23 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_05_08_33-8824590905822626715?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -128: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-106\x12\x04-104'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -128: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-106\x12\x04-104'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_05_14_42-9343493441872925995?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 719.867s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-120017
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a51212f609a165a248257c36158170a34a8633315ec2d57a959625927c370337
Deleted: sha256:dd49020b9fd39677d124c29fb33d9ecabf3e44019d4f397aaa6710c25874f59e
Deleted: sha256:eaed81ab7893df8ed74dbb39dd1fac7274de0015237c79e403c81aff0bed469d
Deleted: sha256:d3beb4c544c32e60fbfef36c2c03066ce49a18a6a553b3ba91d53368188cf030
Deleted: sha256:db26aa9d6290a7ec187c475bd96d11ef037d8c9a039eec2f47558b40c4bc0fdd
Deleted: sha256:c78e755d37e32b3aec2006bd7392ffe7475444998262b2710800c467e1b5598b
Deleted: sha256:ff7df4e9a4c3cf5488ad42e93b39b8a900aad69e860aeccfbb2ba4668a28f486
Deleted: sha256:7429b99f6f24414cfce93a6c9942471e68874fc0eaa335cb770bcdecdf1d22e8
Deleted: sha256:7863bc062b6ad4772013a54142461ae3fe7a637f9135597ca83fe9fe090f3ebb
Deleted: sha256:9e0270d481784bf23b1fb825d9d3e7a001177a39827e0fd9e8bc1a71fd0ea855
Deleted: sha256:535cec9318e4eda51b3b675dab1a90d262da8abf1ab477652b9fa41440cb08ce
Deleted: sha256:3c4b6050660e522871aa5a9c6403d8b8e3f5dbc03596a058710b2c5606b4357e
Deleted: sha256:3baa7c274babfece66853f3dbd5944eac3f4a3d850226cc7d5913f39eff8cf2d
Deleted: sha256:5df7baf57a71332914f8a57757a34c31ae8d1fe1ba8254528848385f0d7fb4b4
Deleted: sha256:05bc4ce73698a853731815326d0380ba5227e6127e25477ba53bb9350e172575
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a51212f609a165a248257c36158170a34a8633315ec2d57a959625927c370337
  Associated tags:
 - 20190629-120017
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-120017
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-120017].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a51212f609a165a248257c36158170a34a8633315ec2d57a959625927c370337].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3683

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3683/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6692] portable Spark: reshuffle translation

------------------------------------------
[...truncated 207.64 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_00_36_16-7846030029129310847?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_00_41_54-2307305936616572146?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 674.918s

FAILED (errors=2)
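
Note: the repeated KeyError above is raised by the dictionary lookup in bundle_processor.process_bundle shown in the traceback; the SDK harness routes incoming data elements to consumers keyed by ptransform_id, and the element's id is not present in that mapping. Below is a simplified, hypothetical sketch of that failure shape only; the class and variable names are illustrative and are not the actual bundle_processor internals.

# Hypothetical, simplified illustration of the failure mode logged above:
# a mapping from transform ids to the operations that consume incoming
# data, and a KeyError when an element arrives for an unknown id.
class DataElement(object):
    def __init__(self, ptransform_id, data):
        self.ptransform_id = ptransform_id
        self.data = data

class Receiver(object):
    def process_encoded(self, encoded):
        print('processing %d bytes' % len(encoded))

# Only one transform id is registered in this toy example.
receivers = {'known-transform-id': Receiver()}

def process_bundle(elements):
    for element in elements:
        # Shape of the lookup that fails in bundle_processor.py: an unknown
        # ptransform_id raises KeyError, which the runner then reports as
        # "Error received from SDK harness for instruction ...".
        receivers[element.ptransform_id].process_encoded(element.data)

process_bundle([DataElement('known-transform-id', b'abc')])
try:
    process_bundle([DataElement('\n\x04-107\x12\x04-105', b'xyz')])
except KeyError as e:
    print('KeyError for unknown transform id: %r' % (e,))
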
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-072819
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d0a6b86f3a3e4161e5cbd751a854393a230f732182370150216be145af69f95
Deleted: sha256:f0f5f214a3cfadc0cc52b475f63b78df17280837259beab5932eb295ff13e4ae
Deleted: sha256:35a57d4715d400c344a508a0a93f034f169c40ef2e4e24df3935b47694133e8d
Deleted: sha256:02cc1b070daba51364f8ddd7b0eeb73901c88069175cf13a44d8b3a8f4ccae1c
Deleted: sha256:7256da4bf673bf23dfeea4b9370b6e61b3af788852e073fecaf8b42c686b08bc
Deleted: sha256:3877d3cd040349f2737e9216b65fd6504bfb58d513d83df97bbee3b77d9049ae
Deleted: sha256:519f166840ccb70acd8ac6b665e29a0a3ede759a308078f662dfc639adc5983e
Deleted: sha256:cdcc57d4f7e98721d616bc8d689251869bb9361400b04a1f4fee3d4fbf02f8f8
Deleted: sha256:2dfd107509140c03924497144bb644349931a0a18ade2158aa31fca6465cbe57
Deleted: sha256:7abc7ebdcaa512fc3ca1f43c41a605cafa68e4ed3e6a16d83df4dff32cdd2249
Deleted: sha256:4a7ebd75b3857761a81a0b01c9e70b5555cbb98e0129390ef3ff780b43320f2d
Deleted: sha256:70f7ee1152e7eaf644f613d1e30c896afb3efa0781f71de73920ab8eebb680f6
Deleted: sha256:1870bf578cb8777bd2f3970b7d741e72c33885a75a3b634cb3013d6f2b189f30
Deleted: sha256:c1a78df410d49227ea9f5011d98195ec48b505b63648b3699cc1f98fa5a7515f
Deleted: sha256:1f6388ce565e7a0d7510cbe785d6f227de08935697370a2a56f83c572851c9a2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d0a6b86f3a3e4161e5cbd751a854393a230f732182370150216be145af69f95
  Associated tags:
 - 20190629-072819
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-072819
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-072819].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d0a6b86f3a3e4161e5cbd751a854393a230f732182370150216be145af69f95].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3682

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3682/display/redirect>

------------------------------------------
[...truncated 208.14 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
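
The flags passed via --test-pipeline-options above are ordinary Beam pipeline options. The following is a minimal sketch of parsing an equivalent flag list with PipelineOptions; the concrete values are placeholders standing in for the $PROJECT, $CONTAINER:$TAG, $GCS_LOCATION and $SDK_LOCATION variables expanded by the script, and the --output flag is omitted since the log shows it being discarded as unparseable.

# Minimal sketch (not the Jenkins script itself): building Beam pipeline
# options equivalent to the --test-pipeline-options string above.
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

flags = [
    '--runner=TestDataflowRunner',
    '--project=apache-beam-testing',  # placeholder for $PROJECT
    '--worker_harness_container_image=us.gcr.io/apache-beam-testing/jenkins/python:TAG',  # $CONTAINER:$TAG
    '--staging_location=gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test',
    '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-validatesrunner-test',
    '--sdk_location=dist/apache-beam-2.15.0.dev0.tar.gz',  # placeholder for $SDK_LOCATION
    '--num_workers=1',
]

options = PipelineOptions(flags)
gcp_options = options.view_as(GoogleCloudOptions)
print(gcp_options.project)
print(gcp_options.temp_location)
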
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_23_08_41-13780641267024281635?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_23_15_03-3709687458352547359?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 754.436s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-060012
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d44ada53d7b584b89d192a8a0872fe655d57d5ecf69f92dc5600f85865d90589
Deleted: sha256:ea0a9e49c6e684e0b6f9fc90cc39d0b4ca9b22d4c4a863d3e1556cd51cbfa795
Deleted: sha256:398d29eb4cf69f6c0aa873ae73704c7cc39dd8d45c58cb07662c18800a6fb420
Deleted: sha256:0e4e5d5afab969e47788b4bd4303affe382f492a70c94ee4c272f0452ea02e1a
Deleted: sha256:ed6aa976cd83699b4ba5e706566f281f5343a4e364ec59ad3a9a641ee0ed552d
Deleted: sha256:1491343a1f831f7f70092978789f18399a666b8b47ea8f9f21de331f34108e24
Deleted: sha256:ec389742a3c2f48d4223864b7fcdaad92d6b26c8eec39c0229ecda04c95dd55e
Deleted: sha256:217732b8b58a15d915d0a6abd2c3e4e017da62a091cf6684d39c401838add5e4
Deleted: sha256:43b9555f4971297aa33eace2e4ecb91a1a17276e8d364093839765be9cbf4509
Deleted: sha256:a65b0970faec1e74ada4c287f9e76f87caeb58524d24edf53fa17b2404761bed
Deleted: sha256:3ce1d68e112bd1cbe06ed8773abbd076da27a51ed08736bc8089d815f5e836bf
Deleted: sha256:bbd913a32e82b0c79e178e034de2e23133fd062d0da98468332b433d8a1797fc
Deleted: sha256:c45d71ccf3b8bfe66b898c18c48f9138adfd417e040ef07a511206c9b9ce55ea
Deleted: sha256:07f06458d651d63617dc0e063d7f4f7c8b39fa383cdeaebf0309d00da34712a5
Deleted: sha256:a868b980bb3a89700238b220ff56c1ebee68a99525710898d05c71eb4ed7590d
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d44ada53d7b584b89d192a8a0872fe655d57d5ecf69f92dc5600f85865d90589
  Associated tags:
 - 20190629-060012
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-060012
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-060012].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d44ada53d7b584b89d192a8a0872fe655d57d5ecf69f92dc5600f85865d90589].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3681

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3681/display/redirect?page=changes>

Changes:

[heejong] [BEAM-7424] Retry HTTP 429 errors from GCS

------------------------------------------
[...truncated 207.68 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_18_46_39-2108423113879928544?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_18_53_17-15804849240340790604?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 743.631s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-013927
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d6d847773c3c0c3c25596d7e4f092ae804987913479a7fdea30ac7708fda271
Deleted: sha256:8e9fd848475b5a4811ea5bc06b6f6abde6797bdee2068eeabe24f30ff75a0ab9
Deleted: sha256:d35fd2431b8378cf9166a1cfa905e579c6a1277f73ffdd108ecf008727cfc149
Deleted: sha256:568bc4504211e76741015413fbfabc7fc5907d9bf1d6e167b908a903ec9ee3fe
Deleted: sha256:22b887b7f7677f276ca3f41457c732abd5705fe58279056066a6b37c0f654e5d
Deleted: sha256:cd35e130abd4dbc5a0c2c55587bc4889805250e4c4b9eab91392af98162bfefb
Deleted: sha256:f4f57d019d478f2c0703dc7bc0f92366022ad4d48c6f912a2e9ce16440ef0963
Deleted: sha256:618c49f729992d3d3ebd0ce44612c9540c67528f5ead9f890af43ddb44b1894e
Deleted: sha256:75e00a9ef6ad0a1ddac65b29bf4f9f271510b324dd57119056f5c3a4473ccb57
Deleted: sha256:e56fa303b7f950875551c63221a613773c67105733a2e51862aa6e32be173ea3
Deleted: sha256:57f52cd1fc39a6446225c14c52d2ea3d131af50d7c68701c650d5302bf315ce2
Deleted: sha256:ed24240ae6af1fdd9f417522e863c5b868c06e0dd9104b462e35a06a14393740
Deleted: sha256:276c4f02e3c33d4900d6eb16051063984b0d8e8249fea1081b378e5d21f7ad71
Deleted: sha256:33514e1a9f69599598750415461236df32b57f18f7f8db703ac6a6a8c83d2eae
Deleted: sha256:1a9442a2c1523df94465d7bf5c98a5135d03b035dadfdf61e071f42a59a00490
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d6d847773c3c0c3c25596d7e4f092ae804987913479a7fdea30ac7708fda271
  Associated tags:
 - 20190629-013927
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-013927
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-013927].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d6d847773c3c0c3c25596d7e4f092ae804987913479a7fdea30ac7708fda271].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3680

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3680/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7589] Use only one KinesisProducer instance per JVM

[iemejia] [BEAM-7589] Make KinesisIOIT compatible with all other ITs

------------------------------------------
[...truncated 207.78 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_17_10_54-6122378245854945616?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_17_17_32-5116264454266151282?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 824.391s

FAILED (errors=2)
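
On the test side, the traceback for test_metrics_fnapi_it shows TestDataflowRunner blocking on result.wait_until_finish, which raises DataflowRuntimeException once the job reaches the FAILED state and attaches the harness error as the message. A rough sketch of that surfacing pattern follows; it is simplified and hypothetical, not the dataflow_runner.py implementation.

class DataflowRuntimeException(Exception):
    """Raised when a Dataflow job ends in a terminal failure state."""


class PipelineResultSketch(object):
    def __init__(self, state, last_error_msg=None):
        self.state = state
        self.last_error_msg = last_error_msg

    def wait_until_finish(self, duration=None):
        # Simplified: the real method polls the service until the job is
        # terminal; here the terminal state is supplied directly.
        if self.state != 'DONE':
            raise DataflowRuntimeException(
                'Dataflow pipeline failed. State: %s, Error:\n%s'
                % (self.state, self.last_error_msg))
        return self.state


result = PipelineResultSketch('FAILED', last_error_msg='KeyError from the SDK harness')
try:
    result.wait_until_finish()
except DataflowRuntimeException as exc:
    print(exc)  # mirrors the "Dataflow pipeline failed. State: FAILED" message above
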
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-000235
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:9f1821f219e838515c8683905a78fc74bb729c38206afc025b05aefb3c332d8f
Deleted: sha256:84667fc415765595c1dd1b84fe039a29ddc9a8d2bccba62a0e5512da09093e0d
Deleted: sha256:c15bcb84931195b336dfa6194884d7ec24bd5e957cdb34f536fa4506fd035eb6
Deleted: sha256:d813b791dd21ee059e847c556e79855919148db2563b601ea201daa25f6050d7
Deleted: sha256:301df43c05f82904d6f3a2076a91927ca62ac476196fb22bfcb67b0986f0d433
Deleted: sha256:a87f8d77f78e46efbae9eb86b066ab3745997a58c22cad143a1b29eb82782832
Deleted: sha256:c5cf1681af6775fa82b1f9307b42e48c51907e645a68057f7c518c3984fd80a3
Deleted: sha256:9f861540808601eaa4629ffe24857f4d891cc82d3124c977307e267121f69fad
Deleted: sha256:e3aad566ee86167cc9ed62962005d92eb1194f32a0b5a74f47b7cac94a29ae85
Deleted: sha256:dbfb2336d9e052c6ffad449df60410fb1e2ff9ae48b84a0315f80978eed583fe
Deleted: sha256:afd72826fa181a94745a73e5f1b3301344b874b26757268d804a7bc56bc06336
Deleted: sha256:ac62eb7d85597edca78494a2c4c5aeb3285aa229144c9333a6012629198503a0
Deleted: sha256:c10d48a861433735bf82bfc4b7cfdfe238b407f5ad16e3f24320c1714aa063c8
Deleted: sha256:d647ac539f491372e209e9ab0efec064eb54833d7365e5679ddf0a49bf3ab270
Deleted: sha256:ee14d3a1541e2bc0fe6cc9f9c7bf2f21dc9b44001056b1a68af694806e34f00c
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:9f1821f219e838515c8683905a78fc74bb729c38206afc025b05aefb3c332d8f
  Associated tags:
 - 20190629-000235
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-000235
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-000235].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:9f1821f219e838515c8683905a78fc74bb729c38206afc025b05aefb3c332d8f].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org