Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/21 06:19:47 UTC

Build failed in Jenkins: beam_PostCommit_Py_ValCont #3621

See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3621/display/redirect>

------------------------------------------
[...truncated 206.64 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_23_08_17-6060500074474696161?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
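(The KeyError above originates at the dict lookup shown for bundle_processor.py line 593: the SDK harness keeps its per-bundle input operations keyed by ptransform id and indexes that dict with the id carried on each incoming data element, so an id it never registered raises a bare KeyError whose key is the unknown id. A minimal illustration of that failure mode follows; the class and variable names are hypothetical stand-ins, not the Beam worker's own.)

class FakeInputOperation(object):
  """Stand-in for a registered data-receiving operation."""

  def process_encoded(self, encoded_data):
    pass  # decode and process elements here


# Operations registered for this bundle, keyed by ptransform id (ids hypothetical).
ops_by_ptransform_id = {u'-100': FakeInputOperation()}


class Data(object):
  def __init__(self, ptransform_id, data):
    self.ptransform_id = ptransform_id
    self.data = data


# A data element arriving for an id that was never registered reproduces the
# error seen in the log: a KeyError whose key is the unrecognized id bytes.
data = Data(u'\n\x04-322\x12\x04-320', b'')
ops_by_ptransform_id[data.ptransform_id].process_encoded(data.data)  # raises KeyError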

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_23_14_15-13371229141806641658?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 689.081s

FAILED (errors=2)
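(Both tests fail through the same path: TestDataflowRunner blocks on result.wait_until_finish(...) and the Dataflow runner raises DataflowRuntimeException once the job reaches FAILED, carrying the worker's last error message shown above. A hedged sketch of how calling code can wait on and inspect that state, assuming a `result` returned by some pipeline.run(); the function name is hypothetical, the Beam imports are standard.)

from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException
from apache_beam.runners.runner import PipelineState


def wait_and_report(result, timeout_ms=900 * 1000):
  """Wait for a PipelineResult and report its terminal state (sketch)."""
  try:
    # duration is in milliseconds; loosely mirrors --process-timeout=900 above.
    result.wait_until_finish(duration=timeout_ms)
  except DataflowRuntimeException as exc:
    # The exception message includes the job state and the SDK harness error,
    # exactly as seen in the tracebacks in this log.
    print('Pipeline failed: %s' % exc)
    raise
  if result.state != PipelineState.DONE:
    raise RuntimeError('Pipeline finished in state %s' % result.state)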
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-060008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:296b6136ecab49ba12123647888e218339eba20bb3059169d095a462fc160db7
Deleted: sha256:0b7d9ba16a066ab9a2eb45ecffd257ea4108381c7d2d13d1d886f0d1d9169b4b
Deleted: sha256:daff0d3a7ca8521a0198a9ab3ae25b8448df5c7e59b09fef0f4fbc18da684cb0
Deleted: sha256:b527f0741ec88c23409f61f4cc5044640db2fe731099da47a9c51acf765a34e1
Deleted: sha256:b690d389f90eec1920c3ad8396fbe839dbb5910f2fb39895e79a2bb1d247d246
Deleted: sha256:abc751a8ef8ff44e771f65e2d65567a950b4258c4cdc8a24ab1882cceab76c59
Deleted: sha256:0fb89f4adef75299a533c734fa70ad77c02a6ec0a8c70e34fda84fa2f7e9ef91
Deleted: sha256:c3c291d339d8552d9bd215b7e94edc965fef5c9fd0129107849b2ce1aa1cb621
Deleted: sha256:c128d80200b6039325fad6c626b610d014f106c791bc8f288c204b4fe217b16e
Deleted: sha256:c1e4a10a66d8a77c1216b596bad5fbb8039fbb9fad464f39855df9d295b3bc15
Deleted: sha256:6ffec123d1ac5e3c37d3c17facb10f231ddf1fc21396d28229b87060fb480913
Deleted: sha256:ff9b32e842bf439b2f9aa89d120a110e4dc344a26918d7b03eb6192c9a462b13
Deleted: sha256:e18f7aedf84a236a6a6480914f0bbe8e4cc65b065cc360045c42a053516e19ef
Deleted: sha256:ee3fcba6889bcab044e05294b92d40cae7b8e4edbddca01effbcbdf16b5b617a
Deleted: sha256:530ab8d74617dc4a7c9d8e5ecde5758d05e56de2812396e2792306de4849a469
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:296b6136ecab49ba12123647888e218339eba20bb3059169d095a462fc160db7
  Associated tags:
 - 20190621-060008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-060008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-060008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:296b6136ecab49ba12123647888e218339eba20bb3059169d095a462fc160db7].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_ValCont #3720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3720/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3719/display/redirect>

------------------------------------------
[...truncated 208.25 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_08_43-3863655898976076169?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_16_11-10906469653674380520?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 815.020s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-180011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:7804478c37ec3682df3395abf43ca0fcdcbbab78f2feb615de1a20a3d667e019
Deleted: sha256:cb253f66a63544d9392d17636cd262a7fdccb4cb2eff58df2f7eb4017103ef7c
Deleted: sha256:4973a15bb50f72d3aec3df7fdcebdb7b3dd385e8672faa0ad29e8474f5936929
Deleted: sha256:c26a20aa35ab85d43a986ff9986ce40f8a29465cd312569a32cb3570960a60fb
Deleted: sha256:348e925365134bad5d6ad48558c070f12ce8c658e71dc226113f83793cff33a2
Deleted: sha256:350d221d21cc8e909aac90195353036d2395516f655df5f428eb959264d704c5
Deleted: sha256:40569dad9f1ab24247992ea25f4ff6782b7f141c844307fe0df75b5e8fdefea8
Deleted: sha256:29afd2c6efb5835ffd92206c06dcef3859814c4e2b79d9ed46796cbd610e2de1
Deleted: sha256:5fc3f1a3e6c0aee7e160b6523c6ce502e6c81f862bb7b0298efbc5082f3ee6b8
Deleted: sha256:291858aedc63bf7d47fe698693fa38c1251b55935b091443b817b2924e979ef6
Deleted: sha256:73457211c004f14c2f937b67cb9f547bc3b85d83acd3baa26d3b77799652a398
Deleted: sha256:7e13e36844c83629026b086fc959884110381367e3e974d72287a191a36b3664
Deleted: sha256:92850e9324724d46ce0663a81620112e66dd82804c908e148cf0420606f5514c
Deleted: sha256:2e9b34d761c0882b3a58ee9a8465616709e7996b78a723f3a58b31a874bf26cb
Deleted: sha256:841941b37ffc941b8eaf3f3ef75cad1cf3c430772d3ce6ce31a7ca119cced081
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:7804478c37ec3682df3395abf43ca0fcdcbbab78f2feb615de1a20a3d667e019
  Associated tags:
 - 20190704-180011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-180011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-180011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:7804478c37ec3682df3395abf43ca0fcdcbbab78f2feb615de1a20a3d667e019].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3718/display/redirect>

------------------------------------------
[...truncated 207.31 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_05_23_58-11063630514579930569?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_05_30_46-921014861769287707?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 744.166s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-121650
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8aec62bae3a1d6e6f0fb9fc06a413134ec6d508acc7719398a5c87b7927782f
Deleted: sha256:cab1138f70b3910e8f4bd1d151d0bc9c6117997c37be6e180af06e300f0cf2ee
Deleted: sha256:b8bb563229e2a24da8c04a096cb6635e0de7bc7404eb21be376b78fe65007a7f
Deleted: sha256:a291ecc029f4cecd6d448f33323e430ef2150e51d9dd38b1528dfdde55f95325
Deleted: sha256:0ad3f0a6bf131edac1cac34e9c61121e7981cb435113213bf1107b44d58cb4b7
Deleted: sha256:1d0d2b34f8533459faccadbf0eaf93c62b57812cf4943f815622e4f0c683bcd6
Deleted: sha256:c9617d316354c5027afb111779d4f601ae33f721c9b9f2b86aba4f0900353407
Deleted: sha256:8714433aed9ec9ea78833c99bd392f53db9c85185cb40a1b93afc7797fcca031
Deleted: sha256:a670dcab1dac24128fa469915f54162bb4c45742564298dcdf9fbb2a9044906d
Deleted: sha256:0ec949e0824a7ac58360027a394a430ba4e617f371fa24d448f525d6d0695d46
Deleted: sha256:64f9506411d2c8df71987cdccfd0d2fa2585fffa0e9c670016d71c2c2b7da987
Deleted: sha256:f9de546fbc2dffc65a7c49d88d6c34b48fb22bdc70b84f8e9d7fbc2013042ddd
Deleted: sha256:6fc62be8e6052b03560ac27f1ed61430ac99f68b4991a625c446b029ace72f19
Deleted: sha256:477ecc3178681f79ad9120e770189f8d321dcdf957c0fa2e3fb51e99c6f65064
Deleted: sha256:bf8aadb3bd94b5a5b26bbaed13b3e10bf34d61c88786576a3059b8914be73a42
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8aec62bae3a1d6e6f0fb9fc06a413134ec6d508acc7719398a5c87b7927782f
  Associated tags:
 - 20190704-121650
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-121650
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-121650].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8aec62bae3a1d6e6f0fb9fc06a413134ec6d508acc7719398a5c87b7927782f].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3717/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7550] Reimplement Python ParDo load test according to the proposal

------------------------------------------
[...truncated 207.63 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
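The warning above points at the replacement module. A minimal sketch of that migration, assuming a hypothetical project ID 'my-project' and kind 'MyKind' (neither appears in this log), might look like:

  import apache_beam as beam
  # v1new Datastore IO referenced by the deprecation warning above.
  from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
  from apache_beam.io.gcp.datastore.v1new.types import Query

  with beam.Pipeline() as p:
      # 'my-project' and 'MyKind' are placeholders, not values from this build.
      entities = p | 'ReadEntities' >> ReadFromDatastore(
          Query(kind='MyKind', project='my-project'))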
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_04_54_22-9572408781194139436?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_05_00_20-11181944445268906857?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 774.258s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-114716
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:e1d8bce55776a09ce3de168e03530296b5f19cb6bba63751374c5f3adac71e58
Deleted: sha256:df84d9ad9ae0f9f574a68999e910afecd7770729b3b0ec48d4064e2ced5efa6c
Deleted: sha256:379a03148681ff8bcb92b7f6ad316a998ad0b5e0d5d97651c3e19d3dad2e90a7
Deleted: sha256:ee2d60bcb3c7380c0d4c6c48c70171af27be7c47ba23081822615eb81008ba98
Deleted: sha256:3cbfab870bca8eb976c646f4223c3b7655e350d88002201ad7c92f850e0054b1
Deleted: sha256:f03f9a570e245f4c4cdba45fd90d04af975508b1e5c0c00bfdf84257c1503e7b
Deleted: sha256:c0f0c6b7b4c823c4f4c068ef28348516bf426631497a428a772c23e38c1a2b7d
Deleted: sha256:e7574635e882f824b5051f10b0c1223dd4702ec281529c7c70f0cb6262614bb4
Deleted: sha256:856aa7c9c6327acd18179582e49ee1effdcb12e6f6600077f3fc82112c60c4d3
Deleted: sha256:0949cc5e28f2d7ece6a4b2d27ee0ce058433c4441eda56aa4e4d600a94e28d96
Deleted: sha256:8b2ab1a71caed205f438ba668adfa0c5ef580b18316aa6d75edf5b503ad7b686
Deleted: sha256:2eac3158bfbd480348969ef811db34feb72f205d26509c0b5b5a50b40586abb7
Deleted: sha256:a1116207e976fe1539a17e580063fa4b8c4e0c3282282900fbf7bbb30805f8ab
Deleted: sha256:9acb7472955e802cad51051c4f8860940df460158427781f1c992d517dfca9b7
Deleted: sha256:a90774bce104f2ef2d270709181e56d5bc44c4487bb43208ee299f44a8f7a88c
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:e1d8bce55776a09ce3de168e03530296b5f19cb6bba63751374c5f3adac71e58
  Associated tags:
 - 20190704-114716
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-114716
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-114716].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:e1d8bce55776a09ce3de168e03530296b5f19cb6bba63751374c5f3adac71e58].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3716/display/redirect?page=changes>

Changes:

[jozsi] Add Jet Runner to the Get Started page

------------------------------------------
[...truncated 207.71 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_04_32_26-16586542757698003753?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_04_38_39-155296506188003562?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.651s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-112420
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:f46841ccbed356eed1b5629b4114bb63a09f19085ef3e28560704056c14a00dd
Deleted: sha256:68589fe63ebb7215e3d4fceb05bda8c16436b4a169453789900da321bd4cc34a
Deleted: sha256:7520f0beb6eb99ec5403a8803f18b1ab87f565a618ef6c89eb555718a6e4f1ef
Deleted: sha256:bdc09b0a757163269c135e8265aedc0ac38ffba45574c65b51d399279bc96c3c
Deleted: sha256:0ca37b89f78d90d36b6945a27ebc8e43bd8f545467274f78a38487074d743fd5
Deleted: sha256:50eab9d256fb7129ee73d7f4e893115783c8b0deeb34502028469075b0cfcbe7
Deleted: sha256:ffc0a8fc3b1155d3f93f6446dbc1e9035ce490bc88c92f6b540531c43ab588a1
Deleted: sha256:38ccc6e7fc51654cea4b6c5146c0bc1bc2089046fa10c699a52d1d5f072574b5
Deleted: sha256:de3dab31d88940d8491a5d4cb8dcb9e0119312453b501164cea70cda045f9f5e
Deleted: sha256:ee524c79b73895db8d87a28aa5cf61a24b3858f4b62ec79b25e3d07802ce7ecf
Deleted: sha256:a0a4a702266ec73e019794e568e5f6992372a5008c07c3ee3eefc56566865e8f
Deleted: sha256:9df58af95e973da158f7e35f516e24986fa108e2550063f45eb2acfbde3380f0
Deleted: sha256:2bfbb153f35165d6d8d5821226b7c57fbe26fd3d06e088bb21eac2b67c3dd987
Deleted: sha256:1335be1fb21693f92342fc461774c573c4666472e316b68f87d3b980011afcae
Deleted: sha256:11f42f73ed8c51c3bfc84137aadf04e8c1ce8e23937a9c063975bae98ed6692e
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:f46841ccbed356eed1b5629b4114bb63a09f19085ef3e28560704056c14a00dd
  Associated tags:
 - 20190704-112420
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-112420
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-112420].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:f46841ccbed356eed1b5629b4114bb63a09f19085ef3e28560704056c14a00dd].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3715/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7682] Fix Combine.GroupedValues javadoc code snippet

------------------------------------------
[...truncated 207.71 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -286: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-264\x12\x04-262'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -286: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-264\x12\x04-262'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_49_55-10794539259424561296?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_56_23-2387914916096189018?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
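
The failure above ends in a KeyError raised from bundle_processor.process_bundle (line 593): the
SDK harness looks up a per-transform processor using the ptransform id attached to the incoming
data and finds nothing registered under that id. The following is a minimal illustrative sketch of
that dispatch pattern; the names (InputReceiver, receivers_by_id, process_bundle_data) are
hypothetical stand-ins, not Beam's actual internals.

    # Illustrative sketch only -- not Beam code. It mirrors the dict lookup visible at
    # bundle_processor.py line 593 in the tracebacks above.
    class InputReceiver(object):
        def __init__(self, ptransform_id):
            self.ptransform_id = ptransform_id

        def process_encoded(self, encoded_data):
            # Decoding and element processing elided.
            print('processing %d bytes for %s' % (len(encoded_data), self.ptransform_id))

    # Processors are registered per bundle, keyed by ptransform id.
    receivers_by_id = {'known-id': InputReceiver('known-id')}

    def process_bundle_data(ptransform_id, encoded_data):
        # Raises KeyError, as in the log, when the runner sends data for an id
        # that was never registered with this bundle processor.
        receivers_by_id[ptransform_id].process_encoded(encoded_data)

Under this sketch, process_bundle_data('unknown-id', b'payload') reproduces the shape of the
error reported above, with the unknown id echoed in the KeyError message.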

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 784.492s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-094238
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:100643552cc011232e0a4bcb43d1b3bdebcea808093fe8bad3c1500cc249563a
Deleted: sha256:e819d6f78c4d30ef8b2398a7d05c05a3d6b5122eb0cdd1c1e7e021d2ecd108ae
Deleted: sha256:a2006b9b46dc9a0797e29fdce831a3549eb78bdd39248fc3e863db5aa9702b4a
Deleted: sha256:b6eb9fe1945c109e4f43d437e2bdf8da7925b0ed015dc79ae3e546b23bffc318
Deleted: sha256:b4c27a773db7e3f965b192c51680f49f6297b38c1e2fabb5dd2d3f752e0b9924
Deleted: sha256:f754dbb4ce5dad739788eaab246bf1875c03cc2e611d179c25bf08ff49d1a6a9
Deleted: sha256:2b8001b39eaecc6df83d43bd646a6944b0d5e8261e5d0d29f2583005d15f359f
Deleted: sha256:c6e3946346cdb520ea2f3327d87b2327a554a7067faf3c81cabe37cc54d1914c
Deleted: sha256:6eeaaa76d780783e1e0e87c77d530a7eb5f8097820b8c9f171051399fad51713
Deleted: sha256:a0a42389b86d4e416b58e69408ec067864330e27b3a3323b72ca420917084483
Deleted: sha256:276f74138ae8d7476b5715daae1e94e79ac6f39448dea7afab4ad94556c844ca
Deleted: sha256:6f3164c2cae5d83a31492fbfeb3a2e6e6e640dc2b6db9984b629fbe60082585e
Deleted: sha256:49ce765de428bd7fcd299ea4f2b3106c2473177b61609837e8b47c0ef733685f
Deleted: sha256:38b6b5eeb3484ac5b3d371c0004edd5754d42e06a6349c8c807075e7dc1242c3
Deleted: sha256:18be53a5c104aae25630403334d6f60722cab1a8b814c95834221e15ce0c8a94
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:100643552cc011232e0a4bcb43d1b3bdebcea808093fe8bad3c1500cc249563a
  Associated tags:
 - 20190704-094238
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-094238
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-094238].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:100643552cc011232e0a4bcb43d1b3bdebcea808093fe8bad3c1500cc249563a].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3714/display/redirect?page=changes>

Changes:

[cyturel] [BEAM-7683] - fix withQueryFn when split is more than 0

------------------------------------------
[...truncated 207.57 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
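
Regarding the deprecation warning above, a minimal sketch of what the suggested migration might
look like. Only the module path comes from the warning itself; the ReadFromDatastore transform and
the types.Query constructor shown here are assumptions about the v1new API and may differ across
Beam versions.

    # Sketch only; verify class names and signatures against the installed Beam version.
    from apache_beam.io.gcp.datastore.v1new import datastoreio
    from apache_beam.io.gcp.datastore.v1new import types

    def read_kind(pipeline, project, kind):
        # Read entities of one kind through the non-deprecated v1new connector
        # that the warning points to.
        query = types.Query(kind=kind, project=project)
        return pipeline | 'ReadFromDatastore' >> datastoreio.ReadFromDatastore(query)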
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_26_48-15938709806256550563?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_33_00-10256505019860755171?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 793.960s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a
Deleted: sha256:ed0446abf010b5dfb4035282a41d1b0e3938d881f44e5ea52e4ff38479de08f5
Deleted: sha256:358837b7c8131a4e0fc256da3c1172d5df8c96582f921580c24e535931b2aa4b
Deleted: sha256:a427a3ba45224aa880c139ca05ffb8178e22957d8c5f2854b1603d3ef4fec0a2
Deleted: sha256:013a877a236bb64c30270a9386a15947571d4b06be265b5f3b063669542048ae
Deleted: sha256:16f19032bf88cc67b4c7161a80949224e896b37f11dbfe6ffa801c7b30d73c87
Deleted: sha256:595a6ea07b8712f4a61627bdb090b00b52cfb1e3f10e62eb04e4c0752e612835
Deleted: sha256:d1444692228516f4454659bc0e8c4694ac27600f43adc4347e3026754c1ab22e
Deleted: sha256:47e711fab358383c37c26850df37cf3222ed38e6c93c012a9c24d5bbfaa5f926
Deleted: sha256:b4b2c9baedb6b5c96af2254ab1bda17b469e7d84daca151ac4ec9717f06b9614
Deleted: sha256:c220f87ecc0eb530bbb3fb08ffd16c8a94c9d3dd9e5f6a7749c3fcc7892617d4
Deleted: sha256:640a6d5419fc32fdce2381a7dace620cf76cfe8c384251b67a81a3e0a9b4e536
Deleted: sha256:2ed8bf7be26e6ab7a199835b1f925c48280544a628ffed8386b874d60f858df8
Deleted: sha256:f7e60c6b424ef14a8adcc19e84b39f8a19392db0130f06dec0d426ee6da124f1
Deleted: sha256:a0c262d3183cfdc2b442ad1e8e05af479eaf08455639deb563481626ce7c3ce2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a
  Associated tags:
 - 20190704-091914
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3713/display/redirect>

------------------------------------------
[...truncated 208.23 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_23_08_34-4850423435734311088?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_23_15_37-11122371123735617418?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 773.921s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-060010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:32738efe692bb8182f0bb343478d002bae1a4cd3fca95d1526cf6adb9c63741b
Deleted: sha256:c1a60af86de73580da028cf1982225731ff52a2150c2da81b51412ed6b0366ba
Deleted: sha256:ab2d5cfa83cc09af94fd519f35ebae8393696735f9f4c99ff6fd4435a619af29
Deleted: sha256:1b023a78a8c5aca65f984e6392a0a3c4811d093268ceb8f68bd9ec51e164d3dd
Deleted: sha256:77c629b108950f0d5ef53312cba2c71551df6531daf6afdd49204029ad689df2
Deleted: sha256:cefa34b6484b8ef4e23c6ba3fb8d0269a93d5b043a7deada6c809180dafc0bec
Deleted: sha256:c2a6f35ee1f99d30b8dff0da9d591f5f01eaee9250f57080dc5e5e8720150165
Deleted: sha256:a81f797c6010d0baa3935c2a12d947c973fbee24bd704dfab74ae954b6e6ee1c
Deleted: sha256:37d3ec3addac4e17dcfc21e4f1d68b68c718d5b4628aabd5a0d7e1ec306c079b
Deleted: sha256:7e350ba7c88345d4d26b054900da36be2e304cb90b03dc598fd77600c0dc5587
Deleted: sha256:f95dfa7562967de503347e8e719a1a01761bd09145b6649435f8951c4a476199
Deleted: sha256:4583d98b7ea297b9e410f05bb17cffa4e705f58a8c1669d602dbdd9b5d83880d
Deleted: sha256:bf1465d604f247d3b15c202cf2311c7481d69f52fbaba11039daeb9f68fb8784
Deleted: sha256:dc006918c84110e2f3c2b20406a2690ebcd19a3986ed5c6a8887f9c5ecab1ea6
Deleted: sha256:8fe2b235892f0fe27717bb32ad9667169349ba69dc1b5810d0db1b73735f879b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:32738efe692bb8182f0bb343478d002bae1a4cd3fca95d1526cf6adb9c63741b
  Associated tags:
 - 20190704-060010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-060010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-060010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:32738efe692bb8182f0bb343478d002bae1a4cd3fca95d1526cf6adb9c63741b].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3712/display/redirect>

------------------------------------------
[...truncated 208.03 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_17_10_41-2658235435766827557?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
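
For context on the KeyError above: the SDK harness looks up the operation it registered for each ptransform id before feeding it the encoded data, and the error fires when the id the runner used is not in that registry. The snippet below is only a minimal illustrative sketch of that lookup pattern, not the actual harness code; registered_ops and route_encoded_data are made-up names.

# Minimal sketch of the failing lookup (illustrative names, not Beam internals).
# The runner addresses each data chunk by a ptransform id; if the harness
# registered its operations under different ids, the dict lookup fails with
# a KeyError like the one in the traceback above.
registered_ops = {
    u'-351': 'operation registered for transform -351',
    u'-349': 'operation registered for transform -349',
}

def route_encoded_data(ptransform_id, encoded_data):
    # Raises KeyError when the runner and the harness disagree on the id.
    return registered_ops[ptransform_id], encoded_data

try:
    route_encoded_data(u'\n\x04-353\x12\x04-351', b'encoded elements')
except KeyError as missing_id:
    print('No operation registered for %r' % (missing_id,))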

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_17_17_35-8395085879827506245?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 834.280s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-000014
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3f40c9a4eda6cdd8c106cecc98ea0a5a0434678aed4352b38db3616ff11a6c5
Deleted: sha256:17aea3da87f682789c7401ac58ffad6e73692187a3e476049a396c3de4d89198
Deleted: sha256:5e700b728429d010caf3f963d5ba144113f9c3b34e485756300ebebe5ef05eff
Deleted: sha256:51698b273f661a7ed82c68d10c87ff6d71400253b8ea707b8a7e6600f5b5d2ee
Deleted: sha256:e07bd6e1a0df5362ba7ba571ebcf4bafe6c4fba622072bff0c14945d4c79e4dd
Deleted: sha256:607dc50022d50bde9df1ae83164ad1992d2592ea8b627485ebebc9200aa3db0b
Deleted: sha256:3812035e509dc186caa930d6ae59e3d95bb9b7c46a00d9a13390b960e2e76886
Deleted: sha256:72b79a7f40c57865afc936f191381c2ee33d30c417d89b6d583c1ead9ab84cb3
Deleted: sha256:381852c75acbd69de8a68ba301df1d2224be1216b3cbd5ddddb2e563f1d42f83
Deleted: sha256:a2bb2d28f85bbd42b218c0b978b3856422a1f510c74902aeb7cd4a023f20a560
Deleted: sha256:a7a85db47dcbfeb404f9705647c3f7a3a55b3be143fa3b93438ce6944b57fe8f
Deleted: sha256:a418379935c2b8058d4243d836f522d1d048db4d48ea56ac82e026df0dfea3ce
Deleted: sha256:5aead3e762a136e253516835e6841e645ef68202a6804a4fb4366766809706be
Deleted: sha256:75c96019fc05c7c83a35c7cbdb0f59713c9f4385de1d135b6955556c897056d9
Deleted: sha256:e1ab62dd1e1de6e2a44e1ba814af73d7f86195fd77d06bf065fd67bfa766d40c
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3f40c9a4eda6cdd8c106cecc98ea0a5a0434678aed4352b38db3616ff11a6c5
  Associated tags:
 - 20190704-000014
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-000014
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-000014].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3f40c9a4eda6cdd8c106cecc98ea0a5a0434678aed4352b38db3616ff11a6c5].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3711/display/redirect>

------------------------------------------
[...truncated 208.08 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
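
The "Discarding unparseable args" warnings above are emitted when option parsing encounters flags it does not recognize and keeps only the ones it does. A generic sketch of that keep-known/warn-about-the-rest pattern, using plain argparse rather than Beam's actual option parser:

import argparse

# Keep the known flags, report the rest -- the same parse_known_args pattern
# that produces the warning above (values copied from this log).
parser = argparse.ArgumentParser()
parser.add_argument('--runner')
parser.add_argument('--project')

known, unknown = parser.parse_known_args([
    '--runner=TestDataflowRunner',
    '--project=apache-beam-testing',
    '--output=gs://temp-storage-for-end-to-end-tests/output',
])
print('Discarding unparseable args: %s' % unknown)
# Prints: Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']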

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_11_11_08-9472331082412783811?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_11_18_06-17100805536906833310?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
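
A note on the suppressed "Already closed." IllegalStateException that accompanies each of these failures: the stack shows the abort path (RemoteGrpcPortWriteOperation.abort) closing an outbound data stream that was already closed while the bundle was failing. The snippet below is only a small Python illustration of that double-close guard, not the worker's actual Java code:

class OutboundDataStream(object):
    """Illustrative stand-in for a buffering outbound data stream."""

    def __init__(self):
        self._closed = False

    def close(self):
        # Mirrors the "Already closed." guard: closing twice is an error.
        if self._closed:
            raise RuntimeError('Already closed.')
        self._closed = True

stream = OutboundDataStream()
stream.close()
try:
    stream.close()  # e.g. an abort path re-closing the stream
except RuntimeError as err:
    print(err)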

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 825.422s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-180246
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:090e589831c4d533b1e39f498b4ca0c8ee6be0a1d4a8149e29a9a83cd8947eee
Deleted: sha256:d548816b5b3225a8f2c16ff789dfe652d8778b42802adfd1d53b4fa8b1942d37
Deleted: sha256:9dcbff1d0471019ae082dcd3ac065361ebf2770510e85b3cba942960bb2f32b8
Deleted: sha256:75bfbf90640edf605ef6be451b1d5bd02e95569ccb703d314a8d000ec25c7eb9
Deleted: sha256:2d84a0e29a29f49bb508044206fe5a2e8d38f85f4a765548338867a5dd3735f2
Deleted: sha256:057022847507ce2c19c40262150fc1a7d11bed42f0b018f52032cb23e5c67a6b
Deleted: sha256:bd7bff5878ce873e4f29bdcd10f1fb3d44b5b50a40bb86ddfbc4b5de3fe54a3b
Deleted: sha256:955223f81ad47656ed51678a98229a70d2bbec2769eaeea6cecab6a1ef9c6922
Deleted: sha256:2e64e6c74805987628e58f759198eded140616ad7d048ddf96e957a1bc2f485e
Deleted: sha256:1182e850ad95da278faa83fc8bca59a5979b2292fa12d6f761146b39323eb29d
Deleted: sha256:76099c1f8be9391fd93e5b42cae1514db06e8e8e911e146f1fa17bd43b409bab
Deleted: sha256:2353f820942b29f2bcc9b0861795a2de1a2d3511d115fecc2667f3a3b4b33fb2
Deleted: sha256:a58a361aa024d9c8c6be42ffa76deb1f617b5c18e796ad7b45f2fb7f5631c5ed
Deleted: sha256:0f37ade3cfcb6aef8e487b665faa284b069495cb9b3535ec3f016c556e0b2a3a
Deleted: sha256:20552ef46dbc59f7808cfd47425facd449ad137c506e4ba42d50762f8ff1a312
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:090e589831c4d533b1e39f498b4ca0c8ee6be0a1d4a8149e29a9a83cd8947eee
  Associated tags:
 - 20190703-180246
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-180246
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-180246].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:090e589831c4d533b1e39f498b4ca0c8ee6be0a1d4a8149e29a9a83cd8947eee].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3710/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4420] Allow connecting to zookeeper using external ip

[lukasz.gajowy] [BEAM-4420] Add KafkaIO integration test pipeline

------------------------------------------
[...truncated 207.53 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_06_35_18-5030715921131329098?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_06_41_51-5852654497484027486?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 763.913s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-132714
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9a9c02482490f91222c7038b2924b63cec94f72b9523b49ec9a2ce70ecaa819
Deleted: sha256:ef5a88078787910d0221e76f5149174906cd4db0c0c52b62b0848c77b87b2f2c
Deleted: sha256:53fd83ad3aae9f1a88601cb90a53b839e4b9102d3e14bfe8c9a33e756e6d22b3
Deleted: sha256:6baaabc23049a9777cc234eb79cd4a738c5410eebee40c6dfb61392fe1abd32c
Deleted: sha256:d08a864d38673f16c2ac5e30b994671ceccc2e267e6bcc2f7361e17538ce1a69
Deleted: sha256:1e4f87c403326216a83b6c7bb78b419ad5e12300d1ca07e92ddae37ed65a3ba2
Deleted: sha256:d3f03f6ca22517dd01c49a218c0c28bf4499bb20c5ff341ef97dd196e932d428
Deleted: sha256:59f980679ba2fdc31db9f4e607de43a91c43f877ca2f56bc34a02b6ec1927abb
Deleted: sha256:a647a673869ca67628274ff1d87d7977ef7cd54479c759608aa6d9d80dc52d66
Deleted: sha256:9a7603cfa80f0219c9bc180cfe3815436127d094ad9ec6bca66b96fe32519373
Deleted: sha256:332c8a72407fd7c68304c9711d34d4a67c20433d54121efebaa86f2ccb59d24d
Deleted: sha256:3ffacd5a4745b97d114c8e0b4466ead9cebca5a0d24bdf94708f382047e70f2f
Deleted: sha256:ba92ba88cb7491ff302299cc0ad8a7b4cd58a1c6d50428c2073c0713cd09da96
Deleted: sha256:5fa79ce3baf88b8e768451ef1f6ad9fd4ffae162cc787cb6a8a8e5e2939a3c78
Deleted: sha256:f6e770c24bff1fb797172d6a2b3e0e22853a7ae7826a505f72aba882c8412c99
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9a9c02482490f91222c7038b2924b63cec94f72b9523b49ec9a2ce70ecaa819
  Associated tags:
 - 20190703-132714
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-132714
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-132714].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9a9c02482490f91222c7038b2924b63cec94f72b9523b49ec9a2ce70ecaa819].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3709/display/redirect>

------------------------------------------
[...truncated 207.78 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_05_09_06-8987770049284610358?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
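
The KeyError key in the Python traceback above is not free text; it is a small protobuf-serialized identifier made of two length-delimited string fields. A minimal decoding sketch, assuming only the standard protobuf wire format (single-byte tags and lengths are enough for keys this short) and using illustrative names rather than Beam's internal message definitions:

def decode_key(raw_bytes):
    # Walk the buffer: each field is <tag byte><length byte><payload bytes>.
    data = bytearray(raw_bytes)
    fields, i = {}, 0
    while i < len(data):
        tag = data[i]
        field_num, wire_type = tag >> 3, tag & 0x07
        if wire_type != 2:
            raise ValueError('expected only length-delimited fields')
        length = data[i + 1]
        fields[field_num] = bytes(data[i + 2:i + 2 + length]).decode('ascii')
        i += 2 + length
    return fields

print(decode_key(b'\n\x04-186\x12\x04-184'))
# -> {1: '-186', 2: '-184'}

So the missing dictionary key encodes a pair of short generated ids ("-186", "-184"); the data channel appears to be delivering elements for a transform reference the bundle processor has not registered, which is consistent with the "Already closed." abort on the runner side of the same job.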

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-03_05_15_40-10014152392539054298?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
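
Both failures come out of the same driver pattern: a nose test tagged ValidatesContainer builds its pipeline options from the --test-pipeline-options string shown above, forces the Fn API path, and runs a stock pipeline against the freshly built worker container. A condensed sketch of that pattern, using only the names visible in the tracebacks (the real tests also wire in an output location and result matchers, omitted here):

from nose.plugins.attrib import attr

from apache_beam.examples import wordcount
from apache_beam.testing.test_pipeline import TestPipeline


@attr('ValidatesContainer')              # picked up by `nosetests --attr ValidatesContainer`
def test_wordcount_against_container():
    # TestPipeline folds in the options passed via --test-pipeline-options
    # (runner, project, worker_harness_container_image, sdk_location, ...).
    test_pipeline = TestPipeline(is_integration_test=True)
    extra_opts = {'experiment': 'beam_fn_api'}   # same flag the failing tests pass
    wordcount.run(test_pipeline.get_full_options_as_args(**extra_opts))

The pipeline construction side is unremarkable: both jobs reach Dataflow (see the console URLs in the captured stdout) and fail inside the worker with the KeyError shown in the tracebacks.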

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 800.483s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-120009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:59afbef89aacdbf259d156ab55e422cefe0eb77a4c0c657fba2097e479017307
Deleted: sha256:385b24a5a9c5c44721b8766a5d3e863c1225aca3f046c55fd0abf1a3866ced51
Deleted: sha256:ebdc060b30906707e4e0c04f77f548d05ea6df2ce95d8ffe050b6696694f6dff
Deleted: sha256:af74a8506739de637725110ac10bceaa9a968bce463bcfa577f9a96f1e57e50a
Deleted: sha256:b4173f47fd9d060c7d5b625441ab98fde35f85764da71ae3d597cc06bb4150aa
Deleted: sha256:0d8662f702b5a72e9cd0be7f828ea8c6a64104288270d479da05ee5c397662fe
Deleted: sha256:e58e667d12f477915cb76c28fdf925437f9b9ee29d272d64c0fd9304a5f83400
Deleted: sha256:6d3678e5d7c1fea3ad3f983110a3c14af92747a77834dc2ebf9596cdc5df13f9
Deleted: sha256:80525fd5c25c4271497124bf30cccd3699363e338920ecf1af022d84291505bb
Deleted: sha256:509f178dc75757be8e6e6ae45bf6f8e9a95b5c2c9c533ecf2ef72c455c533fac
Deleted: sha256:3c049da047721bff89e16808e80d9fb66922c1580285c2042b34fb3ada7619f1
Deleted: sha256:eb32709ca125bde8ff6c2c84d720c7bfb79d14e0646254b35bbbbd0141ebdc5d
Deleted: sha256:dbf4e35e776e828c2eb5c19b7a2b6d75371fc663b2a08c3d3e8ae253d4710ff1
Deleted: sha256:71dd0ed5a5f5f590d58515df09c1998cfbec54d02d060a0c3e3df63bb9d8d9b0
Deleted: sha256:4f56c681d25714d724b65da34520a7172004d6ac7b4ea13603eb6f5f54de0e37
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:59afbef89aacdbf259d156ab55e422cefe0eb77a4c0c657fba2097e479017307
  Associated tags:
 - 20190703-120009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-120009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-120009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:59afbef89aacdbf259d156ab55e422cefe0eb77a4c0c657fba2097e479017307].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3708/display/redirect>

------------------------------------------
[...truncated 208.37 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_23_08_26-9004033485177574791?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_23_15_04-3701900803582736607?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 789.947s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-060013
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:576500b2ff11e126f3a1ce1dafdbbcf49e8d31e1d5584113b58e79e0577cae38
Deleted: sha256:b7892cdd488fa755d1aedc268de78c15c973f4dfe7b6c4b536c01b751f5c2949
Deleted: sha256:280a909d14b97b4ac6624a0b8005f7e046d16b9c366b961ff4b4018efaf856fb
Deleted: sha256:28fb905f46adb0e2e126916f714b898a75646bb3bbdfedfbdadced9faca3992c
Deleted: sha256:2c7e67784a4598c40ef81339d7d50ec4621076892c67a54e403d31319e1015f3
Deleted: sha256:0d955c3685a683c7836c738fa1a6c463c51b761448fc1c4c0a474d4fe41bc567
Deleted: sha256:a58548e12a1006c91662223e4a0896134dce4bb7a7616125134fd2041b1ec331
Deleted: sha256:ec3b2a218e97f874720c337d0b2bd51362f801b32c6246aadb74775e324c6030
Deleted: sha256:99830024467faac658087c55e63e3c47536fd52e9787c0b923f56a4cfb18baa5
Deleted: sha256:aedce23eeb2e6f11b0c75c5e01e47d4c500c6377e0e37e6ccd5ff6cdf10aa01c
Deleted: sha256:62ec5d94997db4f56eae38a753ca3ca00f44edbdf9e39d53a363f9357379027e
Deleted: sha256:2ba43b0c845249575fe9fe503396ab091ec3d8141f3ef158f2cbb2716d94a245
Deleted: sha256:33cbb63c004af1fb4480ef74ddc3829865fa047f44faaf39200ea1cb40763b2b
Deleted: sha256:562291ffaeb2f8b0b33ea844e3060035abe2480868375ef97ac3e7cb17b4b5ca
Deleted: sha256:5c7f923712bc96db154329989ee4e41732076c76341506aa2e708bb790f24927
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:576500b2ff11e126f3a1ce1dafdbbcf49e8d31e1d5584113b58e79e0577cae38
  Associated tags:
 - 20190703-060013
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-060013
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-060013].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:576500b2ff11e126f3a1ce1dafdbbcf49e8d31e1d5584113b58e79e0577cae38].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3707/display/redirect>

------------------------------------------
[...truncated 207.39 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_17_07_24-7151685985290410929?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_17_13_47-11217322693237963728?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 789.840s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190703-000008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:598863df86fbbf56cc5b4309f76e449fe3370fc19a0d16b620dc713b692d265c
Deleted: sha256:d1c38b6df6ca66f2d65eb4c85a780502bab6c21faf3e326b924cac8e92b46a95
Deleted: sha256:5d54fbec7b156a26a60c0bee3c25ec3b16072fb835b9af5f9fd40d16bb6473b5
Deleted: sha256:68b69469d168e8b16d93ac0a61085babafdeb1674e89d5d0a5d78f254c711aad
Deleted: sha256:19744594fb06481bbe398f8fca174cbbccb365b58d059c08d776df02cf28b0cc
Deleted: sha256:1f6b42aab0c74d9154eb8b6ab2644c9296d8448592ed4a8053b304155f0f5c2e
Deleted: sha256:012ce0bd3de63786f23ad6b688a1bf2528f9f4bbfd00b7b6e34459dde8edf23d
Deleted: sha256:a65eebfe768014214e83e94d17fc501f0e08b8a26db69d180341031f242d67ee
Deleted: sha256:16363a8788f84db8c0c68fe2839614b69d02c3651912b5c4af148507e1cf6d21
Deleted: sha256:4b8eebe3ba09db0e08fc702046a1d6a513be1f30a50c80c5ea3459e2108685fd
Deleted: sha256:8929d8d3d4aa0529c8897a1c25adb7cc2f16aefcf7ae2f4a33691ca2e9d90a73
Deleted: sha256:f1a0afca1688f46f2ddd047f4a489ae679eebbad63d76cca8a38473bb67378c5
Deleted: sha256:acafd95cf2369c811ae5d734b0a7ffa5ebdb9fe48a1c7eca91e62fe6ba22f470
Deleted: sha256:5d5bcf5ecaa26d08c3e3d902708cdc24b59cca347c3859128c8c175acc31b6a5
Deleted: sha256:5f27429ffe1382e5c2a7d34047ddf35b1fb149941f74db050f614873045f66bf
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:598863df86fbbf56cc5b4309f76e449fe3370fc19a0d16b620dc713b692d265c
  Associated tags:
 - 20190703-000008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190703-000008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190703-000008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:598863df86fbbf56cc5b4309f76e449fe3370fc19a0d16b620dc713b692d265c].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3706/display/redirect?page=changes>

Changes:

[juta.staes] [BEAM-5315] improve test coverage bigquery special chars

------------------------------------------
[...truncated 207.81 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_15_10_58-14950782493778738380?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
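The KeyError above is raised inside the SDK harness when an element arriving on the Fn API data channel is routed to an input operation by ptransform id and that id was never registered for the bundle being processed; the unreadable key appears to be a serialized identifier, and a lookup miss like this often suggests the runner and the custom worker container disagree about the bundle descriptor (for example because the container was built from a different SDK revision). Below is a minimal sketch of the failing lookup, not Beam's actual bundle_processor code, with a hypothetical registered id:

# The SDK worker keeps a map of ptransform id -> input operation and
# dispatches each incoming data block through it; an unknown id fails
# exactly like the KeyError in the traceback above.
input_ops_by_transform_id = {
    'hypothetical_grpc_read': None,  # placeholder for a data input operation
}

incoming_id = u'\n\x04-323\x12\x04-321'  # the unrecognised id from this log
try:
    input_ops_by_transform_id[incoming_id]
except KeyError:
    print('no input operation registered for ptransform id %r' % incoming_id)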

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_15_17_56-10994450702539444910?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 854.534s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-220323
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2a1cd5709c3c76f302bc269e9c5504ea1c7aa0788fc99c4cbd72adf7d4ab895c
Deleted: sha256:1ee0707e5098cadac47110ffa684daf90b86f8d9fa5042eebf7b48f901af6149
Deleted: sha256:8edbff6e4a0b05060df68bb23c97fc94610989cae1794631b16477637e904502
Deleted: sha256:666a02c25e00dea430ec0ec815c06bd539c70b74169802f930a34cdb07ca2b69
Deleted: sha256:6b3b232f6fd29cd00e210eba71cc08f75d6f4b93f7ff18ce0af1b8aca79e4b4e
Deleted: sha256:185c83bfaed634ce2174e57cc23052b61e91e3d4d24c5729c0459a0d8dd977c7
Deleted: sha256:e1ffc7786b6e454e55f644b03f7259ebae768875b89cbd0a15c1723eee093ed3
Deleted: sha256:b473c390c064ecffa68572b0128d40d35ea0d72f20955cea6472e9cec4d9b7de
Deleted: sha256:87c89f0f08bd05ade65b58693ec01f21552b6b74ccba47718c55369a024a9128
Deleted: sha256:ef7673aba6b30688e745800a2033eb428a9701f2a285b954ca68a4c957c858d4
Deleted: sha256:0c48d629c866270ac4059a075aaaf2ebcc51f7e44ca6d3495ff40c3b3dc1969c
Deleted: sha256:11fe30fd180e2b216bfada3f647011f767921f53b4b3fdbb421e28492e14f357
Deleted: sha256:adc996a1dec14e7aece2a6f4f6b902239831e489f665ccbfb82bca70a803b759
Deleted: sha256:0214bb82a476649967588f1854dbf9cf64f76ba203e64b7516f6915027d44852
Deleted: sha256:abe3660faaa48ba8b912fd0984f0b10627046c9642708a2db52a11f4adc78ac7
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2a1cd5709c3c76f302bc269e9c5504ea1c7aa0788fc99c4cbd72adf7d4ab895c
  Associated tags:
 - 20190702-220323
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-220323
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-220323].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2a1cd5709c3c76f302bc269e9c5504ea1c7aa0788fc99c4cbd72adf7d4ab895c].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3705/display/redirect>

------------------------------------------
[...truncated 207.30 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_11_10_10-11107943754467233090?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_11_16_44-9779842156111470113?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 826.072s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-180208
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4444800c7ac1e88dac85f258838c3da5e5e332ecf71fdcdffdf19dbe8e548280
Deleted: sha256:afe7884705f5046b1a0d922ea8d003df9741f50bf57fdf15d8a68fbf21a42da7
Deleted: sha256:e76b103d4fb8ee0fba438213535227f71ebaab1495ca353cffc8ee906a79bd2b
Deleted: sha256:905208c71b89182b9d990b65b2ff05eddbc2052552f38a1c0a5daf6b08cd1a1c
Deleted: sha256:646b182142bf9dad1b75e3b7a6eeb727ac2561ef90be862c2977f85b1747976f
Deleted: sha256:bb5d18fba12981dc0fa14702f7451d1c5ade3edc9be038c11bbb4ed5a14c8e66
Deleted: sha256:4b4076aedbd0638e2bd18f982cbbc17e7bfaa75ded2a7a54d75155df19ba5fdc
Deleted: sha256:a4733fb4fa1e29a0c1dc642ec757120828272f0c88bf42cd0234c82be4802698
Deleted: sha256:5486ad8178602e51612d0b977b6ff006253f96a8d477f8d5ead510a4e7815676
Deleted: sha256:e8f2a93b8975efafd4f45cbde2570f7125f3e1cbf0da1e88b29c2b4ab9e34f50
Deleted: sha256:075ff23a27bf4d71e31e5a8a91496ed77a57ebe6658730294c929c8c830c8816
Deleted: sha256:2e4a661ebfa130929e6edb1facb92bbae454af7e49e4749653381db4e77a7c29
Deleted: sha256:f75320daefe512c6fda24b91273d3498d2184b55b1327d169792169990dac0d2
Deleted: sha256:c4d1c5ae511a38ef9075fa93068fbb5e407b35fbf9d9610182459996613a5780
Deleted: sha256:bb1ed1014d3c78799bf0b88b5c62ee6126fb3e7323974f38cbaf8a6ded3ab255
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4444800c7ac1e88dac85f258838c3da5e5e332ecf71fdcdffdf19dbe8e548280
  Associated tags:
 - 20190702-180208
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-180208
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-180208].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4444800c7ac1e88dac85f258838c3da5e5e332ecf71fdcdffdf19dbe8e548280].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3704/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7504] Added top_count parameter

[kamil.wasilewski] [BEAM-7504] Create Combine Python Load Test Jenkins job

------------------------------------------
[...truncated 208.25 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_10_05_13-12516357940795460149?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_10_11_21-669043431429153585?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 768.829s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-165755
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:df2f3bca1f5a24e6349101d77f432239a20c2d376c6926c1e863aa0319b0e7c0
Deleted: sha256:7daccc87bf472a28cdaac47351b4b1a864f6b62dba3719a847c3324e2a1c0c21
Deleted: sha256:4ba98c553dec1b28695c5510fefbd059bb3cea7818d5f93413aad515c06f7074
Deleted: sha256:47f1782237152d8bf55c0fc19863fb207b15fe7d335c24f1d1368d13a326e0f6
Deleted: sha256:b57c1d9aa30eb7025e4cc8ce7d6c50de39ca6bd3c472eadcff342db313a5fa09
Deleted: sha256:7c6af6a321eb9e86a5169eadc0815d8ce0d9035064e5904b18066613fe6d3233
Deleted: sha256:41b3cb9341ef7957122b71ce1069052b09c53d35cb1cb601ebbf47f865979580
Deleted: sha256:0c6faa6b4291fae395f15224a60341a07b699da03afcaa1252516bd0b9b8fd4b
Deleted: sha256:51a33eae07de99cabbda61058606e4cfe1e8a5edea594aec0e903ecb3f3ae039
Deleted: sha256:2da773acb68be59732b90159dc8530f2913d4d9d39ccf96d1d3a3e46bc9547fa
Deleted: sha256:0374f77fe7682f42d7e4c152a9c38dda7d71425a9e47c9544e0716a8d7d4d4d8
Deleted: sha256:6f749914ec17347c9e0cac50b57843469e68cd357ef4f5953f734611a27fc70a
Deleted: sha256:23d6c3d53155de234b12d92a0ad423779203d2503d12e4de7f69dbe1db1b3165
Deleted: sha256:47cb1b27537a1466de7b210f9e55a897bbfd6a39a67bc20d4a714a1876a13f9f
Deleted: sha256:a9427185ea2260922caecfd96227407aee42a0f8b8258b03c5e9d205333b6ad2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:df2f3bca1f5a24e6349101d77f432239a20c2d376c6926c1e863aa0319b0e7c0
  Associated tags:
 - 20190702-165755
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-165755
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-165755].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:df2f3bca1f5a24e6349101d77f432239a20c2d376c6926c1e863aa0319b0e7c0].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3703

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3703/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7640] Rename the package name for amazon-web-services2 from aws to

------------------------------------------
[...truncated 207.52 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
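
A minimal sketch of the migration this warning asks for, assuming the v1new module's ReadFromDatastore and Query signatures; the project and kind names are placeholders, not values from this build:

# Hedged sketch: read Datastore entities via the non-deprecated v1new IO.
import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

with beam.Pipeline() as p:
  entities = (
      p
      | 'ReadEntities' >> ReadFromDatastore(
          Query(kind='MyKind', project='my-project')))  # placeholder kind/project

The main call-site change is that the v1new transform takes a Query object from v1new.types rather than the query type used by the deprecated client.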
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_09_14_57-17861540338672109338?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_09_21_16-415108671860983704?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 735.160s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-160508
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ae13728f56a0cbf003e4f0a510df3292afddeb51f32493e8dfb7ec85aab7532b
Deleted: sha256:cbd40ef02200ad2473761de64060130e50a163a6d1845beaaa68dde94247ea97
Deleted: sha256:21412ef4c5cafcaf639ca2e00400e38bf509b15b782237edd039e845ea30fa03
Deleted: sha256:4979145dacc35ad222d5eb40243623e271bf6fff8a2a258e3a71719145683af9
Deleted: sha256:11263dac81cbb269d1e27964b6595822dcf15e033beb227065f23c550b335ae4
Deleted: sha256:29b84ed3e898014c9c3719fbd3c18e28f86ef0a89c24c1b4ab24f378146cdf22
Deleted: sha256:944dfdf2c9cbd240882bdbd4b2f330da1f4be441748bf4cea86f8c7050c29a3e
Deleted: sha256:5b7ca6235f3154ddccfdb71ec135c93f16d021ac655372c0d01822f9c67d47f2
Deleted: sha256:7f5aa52c079b4607e66849c3b65e2645c71c0529f02a41fd68a559ebcacccb93
Deleted: sha256:d22a5cbbcd034acfb22f4fd7c85f26a453b48782bb673af0fa3fbad91ae250c2
Deleted: sha256:45526fbf6464c46a22cc7e3a75b535ea2af9549f7b29237251a216573548322c
Deleted: sha256:9e5fa2b9fbbed0ea03aec5ea3d95f4a1f86ca87a96c8848b82ca5590722111cb
Deleted: sha256:6efbdcb01631ff262d1938b2af60037c89bbe044a000d75f55dac817f088a1e9
Deleted: sha256:76a73ffd2cd74e12cd1176f70009831fa43960e41293d63a751f1bed47fa0f66
Deleted: sha256:65573fc78a589e0642d73f8ffcdf3745fe5b851f4e3acf55cf49e9e2e108174a
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ae13728f56a0cbf003e4f0a510df3292afddeb51f32493e8dfb7ec85aab7532b
  Associated tags:
 - 20190702-160508
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-160508
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-160508].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ae13728f56a0cbf003e4f0a510df3292afddeb51f32493e8dfb7ec85aab7532b].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3702/display/redirect?page=changes>

Changes:

[daniel.o.programmer] Update python containers to beam-master-20190605

------------------------------------------
[...truncated 208.36 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -414: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-356\x12\x04-354'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -414: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-356\x12\x04-354'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_08_40_05-2219288925707518971?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_08_47_09-16003912165937940398?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 804.452s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-153207
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:7ae85fd272783669493e925f171ebd6c9f750a287019439abbd136e8f86e82ec
Deleted: sha256:304948ed198acc740405dcaa0aa243bffbdbc7853248c05aeaf64a56bdf82a6c
Deleted: sha256:d314bfaf51df469713261b4faa8330ed3152d219cedce6a1d287bd86ab9f3143
Deleted: sha256:b426fba5d910e39b99e74f637b158124285c69cd2703464a4816d4715b0d42cf
Deleted: sha256:416af233bcebb2f7551071db0e8148c7b491e898dc3546abefa7ca3c46fccf3e
Deleted: sha256:20949fe8fdab2454332086a04ecd6cf97096bb4132a9c5df614a9207d9b64aff
Deleted: sha256:0609457b7db74e34f618c0e434b1fe705cc751261d703101d49b6b11826a1eb0
Deleted: sha256:11efe9e8c662e558f065f0ca02262e4d1aa4778b71d0a7ec007e7fac0f59f3ef
Deleted: sha256:13957d78e9c4e88545c3e508f686be211793bc79bf23986a517403cbe491d66c
Deleted: sha256:22985ee9deac2ae95ce18e84c570f539608e45119930e521eb12bb4b969cb336
Deleted: sha256:16db7943fd901c6b743d917c821f9d44358e32a585d87a680f5892f2c884703b
Deleted: sha256:df9deaec0a17ac7dc202d6bcad67750504c742e4bf4c13235c87d22a78c7a106
Deleted: sha256:d770238c715ca36d4fd711e4edb048a9a9d53ea69078ae16e33b87e3fe1577ab
Deleted: sha256:513e886f1cb31016609ba9743b13f064fc11f044d18c1dd64629750532ab10b1
Deleted: sha256:a56ae218aee6e7c33f4855066b4bd54a1648b4bc212066f13dacd01efcaf5f89
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:7ae85fd272783669493e925f171ebd6c9f750a287019439abbd136e8f86e82ec
  Associated tags:
 - 20190702-153207
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-153207
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-153207].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:7ae85fd272783669493e925f171ebd6c9f750a287019439abbd136e8f86e82ec].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3701/display/redirect>

------------------------------------------
[...truncated 208.12 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -379: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-321\x12\x04-319'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -379: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-321\x12\x04-319'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_05_18_58-8606748156447795834?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_05_25_17-1767413820336237218?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.096s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-121148
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:e0105589b3d0a126d38d45887e2a8a4a19495c4caa482c806f57deaa3d9ab248
Deleted: sha256:093239b013423284e634bbb8db542df7b800adaeaf009597a62a844ff216e7c0
Deleted: sha256:1f6474e593446db3e0cdb78736e6e956e426b2b28b49ed1a825564c5cd4699cf
Deleted: sha256:e61c1b79c7ed77a0c19c775dc2631aeb456abd7ee262df9f2433f686f4c3b43d
Deleted: sha256:0dec01143ead7ac8e1b806b21cf9401eb76a707858a860aadd681c723b8b903b
Deleted: sha256:f6b86d1c606739d0d7cb889e5abff5e7ec29ac7a6d4e0f549e6c0b728b72d5c1
Deleted: sha256:b328578e5db710cae450b0857f7db95fcf47e5ef3b6495b438520e604c3a905d
Deleted: sha256:7f7efb7719a501391ba50351c383774b2d56f8afb822e4a8a6a184c708f605c0
Deleted: sha256:aa42c77f60e6ed22381c6cb81431577c366fbac43eaf7d323165bd2f046c88bb
Deleted: sha256:456276da7d158343704fa6d041b5351d05bf2b9b32eeef28349c8d01d071fb91
Deleted: sha256:f2e430ab53ddc98030950962364baf586163054cc4042644abafa8e2b4e96c18
Deleted: sha256:3d4fa7f15898275aa5dca6c0bbc61e5bf38164f7949590879908464e4942b11c
Deleted: sha256:96e4c17add6e2cfc762bed0507ee68bf331c7bf02c8d4730d1df07b196472a84
Deleted: sha256:2232a4c95859bf0b8144984324f1f664152730544de0c1143a24fb5bb3e1a5c7
Deleted: sha256:5e08e6cfface0427a1a541cc383b567ff47bafe39e8dbbbad78b448c23eea80d
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:e0105589b3d0a126d38d45887e2a8a4a19495c4caa482c806f57deaa3d9ab248
  Associated tags:
 - 20190702-121148
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-121148
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-121148].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:e0105589b3d0a126d38d45887e2a8a4a19495c4caa482c806f57deaa3d9ab248].
Removed the container
Build step 'Execute shell' marked build as failure
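
Note on the recurring failure: this run and the runs below all fail with the same KeyError raised at bundle_processor.py line 593, where the SDK harness looks up data.ptransform_id among the transforms it has registered for the bundle; data arriving for an id the harness never registered fails that dict lookup. A minimal, hypothetical Python sketch of that lookup pattern (FakeBundleProcessor and the ids below are invented for illustration and are not Beam's actual classes):

    class FakeBundleProcessor(object):
        """Toy stand-in for the SDK harness bundle processor (illustration only)."""

        def __init__(self, registered_ids):
            # Map ptransform id -> handler; populated when the bundle is registered.
            self._ops = dict((pid, self._make_handler(pid)) for pid in registered_ids)

        def _make_handler(self, pid):
            def handler(encoded_data):
                print('processed %d bytes for %s' % (len(encoded_data), pid))
            return handler

        def process_encoded(self, ptransform_id, encoded_data):
            # Mirrors the failing line in the traces: an unknown id raises KeyError.
            self._ops[ptransform_id](encoded_data)

    bp = FakeBundleProcessor(registered_ids=['-108'])
    bp.process_encoded('-108', b'\x00\x01')  # ok: the id was registered
    try:
        bp.process_encoded('-999', b'')      # id unknown to this bundle
    except KeyError as err:
        print('KeyError for unregistered ptransform id: %r' % (err,))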

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3700

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3700/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7536] Fixed BQ dataset name in collecting Load Tests metrics

------------------------------------------
[...truncated 207.92 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_03_37_12-428570342762003146?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -131: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-109\x12\x04-107'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -131: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-109\x12\x04-107'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_03_45_10-8821825635959112573?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 954.097s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-102931
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff50100c8711b54505ead1b50b035379d203da5966d8e2de29d41a4487a9f624
Deleted: sha256:d186cf5faae955a9a535bddf53a92867c83ff865cf4e6969dc1365fc04dc446c
Deleted: sha256:7f146be85e31967186478f5f26a62e4d451917713c05bd7cb84e469a06940bec
Deleted: sha256:6cc4340268b39c8b8f44739376afc14ca9394b2419f212f11d4e370d6452aaf5
Deleted: sha256:00c5a334d915123d311ecf506828944dde1fb1a8021fd8cde893cd5f4cebe1c1
Deleted: sha256:076937ed4777bd5579e991bdc8c8628e44b4a34570618f9fd199c984c7ebe564
Deleted: sha256:3581aa9fc578d7ea6811da4408d294cea68ad9918bf77c95e2a1f1edd7f87b19
Deleted: sha256:bff623e455f35808bf29053fb1f4e6bc64f2d34d63dfb3fb1304919c4663e383
Deleted: sha256:a4075a419aa7071da83423a48231baaf3484eacb5dbff259e951f2f57d72be21
Deleted: sha256:d92e786a3ba9ed33b988aa7e0ae3c0d9ca2b5f4690757f655f577be78111e200
Deleted: sha256:4b318d47369ef1fad6f87d98786b8549fddc9b353ba4a779b9ab9f3e90e0667b
Deleted: sha256:6222f036baafd98330f291729fcce3f960d399d6d7094837fbb1c3e8e15b1b92
Deleted: sha256:6ef73b180f415ddb4bd4b4a1635f1a03e1454e8471ca7a52fe7d2b0eafea44f2
Deleted: sha256:3181e940bafc365fbdd761d5b8f10d181fa50d4660a344aaa8cdcf93711cc27c
Deleted: sha256:0a19cdad36b485b17cc889484ac934eb12b27560a3aad20d6117fdafdcdc0510
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff50100c8711b54505ead1b50b035379d203da5966d8e2de29d41a4487a9f624
  Associated tags:
 - 20190702-102931
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-102931
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-102931].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff50100c8711b54505ead1b50b035379d203da5966d8e2de29d41a4487a9f624].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3699

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3699/display/redirect>

------------------------------------------
[...truncated 208.35 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_23_09_00-6522216733531613032?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_23_15_38-16798764953135447462?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.003s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-060023
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:be6df43081201e155e10922e14996eaa9d3a8c331c164197bdc41113290c3af4
Deleted: sha256:65d8980123311d1140b92c1a0d0e5691b86fd2fe67f687d3faa442726172e9b2
Deleted: sha256:c696ac47d42e502727f269c1dea8c5bb9893bc59bb58539b00424c75a0866658
Deleted: sha256:7594ff2a57755dfdce8289122874f0183e49fcce938d4e49ea2127b16c0ea782
Deleted: sha256:6eead157217d01d39eb6cdfbdbf7c99385939922d8fca429c4e5e9c0071ab5e7
Deleted: sha256:3717d2f0c8798abf6c081065d528986fbbcb2059aa66601f1375bc113b5c9a4b
Deleted: sha256:6ab4191601820c7da6214672e3abe90712503bc860b7b3a7c1742714d55905ea
Deleted: sha256:77950f714de0b2b2995a4ce0bee10628974cf7bba6f160c34a8c7342dcabc604
Deleted: sha256:f52f90b09c499c82fcaffdd94fbdaea1c87ad1fc0d713f6ebd4d353396b87dff
Deleted: sha256:1f9dadf182b3b8be7a080acba2f90ba998abe040d068def2bf7e4fe1d5d388a4
Deleted: sha256:a91c1868fa4433ced86f7cccb095747db3a80c8d6e2e9f88a8a16476cfcb53fc
Deleted: sha256:e5186993fa8f85e4866f1df7abd7316fe19f30c8fb86d056714b72d59ecee34b
Deleted: sha256:7b65b8fdff3b7f4cbbacecde46e3bdb26adea247a44df313239fbdff8f16b06a
Deleted: sha256:e0c191248619fadb30ddbd39f579084f1cdcd72cbea8ed805a95a4ccdabe2928
Deleted: sha256:4bdc5b374f203bc6ab48a9c0afb3c99dd800a0f238167cae01daa0fe5472d85f
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:be6df43081201e155e10922e14996eaa9d3a8c331c164197bdc41113290c3af4
  Associated tags:
 - 20190702-060023
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-060023
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-060023].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:be6df43081201e155e10922e14996eaa9d3a8c331c164197bdc41113290c3af4].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3698

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3698/display/redirect>

------------------------------------------
[...truncated 207.41 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_17_33_36-2246884866252425506?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_17_39_54-10830800617907297680?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
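The KeyError key reported above, u'\n\x04-107\x12\x04-105', is two length-delimited string fields packed back to back: field 1 is "-107" and field 2 is "-105", the same negative-id style used for instructions elsewhere in this log. Reading it as anything more specific (for example, as the Fn API data-routing target of this SDK version) is an assumption; only the byte layout below is taken from the log. A minimal decoding sketch in plain Python, with an illustrative helper name:

def decode_length_delimited_pair(raw):
    """Split the KeyError key into its two length-delimited fields.

    Byte layout taken from the log above: 0x0a and 0x12 are protobuf-style
    tags for fields 1 and 2 with wire type 2 (length-delimited), each
    followed by a one-byte length and that many ASCII bytes.
    """
    fields = {}
    i = 0
    while i < len(raw):
        tag, length = raw[i], raw[i + 1]   # assumes single-byte lengths, true here
        fields[tag >> 3] = raw[i + 2:i + 2 + length].decode('ascii')
        i += 2 + length
    return fields

key = b'\n\x04-107\x12\x04-105'            # the key from the traceback above
print(decode_length_delimited_pair(key))   # {1: '-107', 2: '-105'}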

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 709.761s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190702-002544
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2be9dd9910ee246a85489079e5c76395743decc7b4eb4dc02816b9f0841c9ae4
Deleted: sha256:a9a48cbf669ab0013c5f0aa46b08c3a2f4a80b14f63591136383a36d3b3a329a
Deleted: sha256:4c23b3c4ce057ca76582104f7159910e8cdbb9d06883c03117e64a83c6950ec1
Deleted: sha256:73329a543cc25a19b69b7c01c84c6c7c35db5ced1b7d7e2950abbd94dc9355b1
Deleted: sha256:8274180a92a2264a7290ccc14edb16a3995b89ec0de0e6259c99b4c28e9cc314
Deleted: sha256:a74f4c4681a9d61f82127e8b91628c3f74db15c2234d0adde3045bb97c4ae4ff
Deleted: sha256:63581655b0d9f33a23a8951e2f02bdcd23bacb2ee7d5bb96800c787fd933fcaa
Deleted: sha256:0a74f149a39dab849e5478ff8878992532ced4db93207f22476a3d499dae6482
Deleted: sha256:28784da26d9a3489b2e94485aea4f4508d3dc1a3d494adef9fe5d1eca32bf13a
Deleted: sha256:fb20f5c340e16a7b0d8b711b650d67661bbe0821b34b797a5580570cf670bab4
Deleted: sha256:c7013dc3e8d7724d777b0fbfca2a83100a8376cf7a4abec4539f259af96e68ec
Deleted: sha256:530e4970c248c361026e46112dafb7713a0f775ddb57cfdf64c87f5676f72111
Deleted: sha256:bc7e0c6853ae678823c8b73ba129ff644ebb52770a984cdec2beb774a3a47bc8
Deleted: sha256:a768204c0e83531077f6c073923bd2a6c454df5bb9590164380afb431f5ecd53
Deleted: sha256:42eab8683067e7a9a8c427a5f355c777dffbefdb2051937ec671fb0134b1b4b6
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2be9dd9910ee246a85489079e5c76395743decc7b4eb4dc02816b9f0841c9ae4
  Associated tags:
 - 20190702-002544
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190702-002544
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190702-002544].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2be9dd9910ee246a85489079e5c76395743decc7b4eb4dc02816b9f0841c9ae4].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3697

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3697/display/redirect?page=changes>

Changes:

[github] Tiny typo fix

------------------------------------------
[...truncated 207.80 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
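The migration the warning asks for is mostly an import change. A minimal sketch, with project and kind as placeholders and the exact v1new constructor arguments treated as assumptions rather than something this log confirms:

# Sketch of moving to the non-deprecated Datastore client. Only the module
# path comes from the warning above; the Query/ReadFromDatastore argument
# names are assumptions, and 'my-gcp-project'/'MyKind' are placeholders.
import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

with beam.Pipeline() as p:
    entities = (
        p
        # v1new routes everything through a types.Query that already carries
        # the project, instead of the old (project, query) pair.
        | 'ReadEntities' >> ReadFromDatastore(
            Query(kind='MyKind', project='my-gcp-project')))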
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_16_58_24-13100792213788527829?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_17_05_57-2180223769591991146?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 843.787s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-235102
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:0af072e749caf16306aeb4c1568ac6ed23c5b98468be70d2cfd67cebc201eae9
Deleted: sha256:0d45a988f071f0d8306d6195db7d9833e0090091781d1bb9c916897e6cebd6dc
Deleted: sha256:a28b649a41040a56539031b685f38405a3b7866837fea1e089143bd6442f2aa5
Deleted: sha256:bb4fa1bf81e800a093a36ccfee220aba58229404dab85dd753ae2468d82cb31b
Deleted: sha256:25724a22d666f8cc2fdc951fb1ce28e5e58743f01cb500c2bcdc8a115c4b2e49
Deleted: sha256:95d28f0b444f5a41d2986a1152211427245dfc417cc1dd8c28ed82ad27c96de9
Deleted: sha256:40fdce96099fff9de59ae5289b34fc3227f4efe52c75a64e946c74038e79eb88
Deleted: sha256:f19927d98f92ac6c06f25da9504f3251d12867c3dd99c4f57a400caaefc8a9c1
Deleted: sha256:fa2c3f3254450d2a5482f307d7ea5412e9ae542a05138fe8e1f6611248277bd2
Deleted: sha256:bf0480ad1e2ce32f8737d80c731d5dbc02938e513140694432f711cf69e3ed91
Deleted: sha256:995cbeb58dd2c89d6d31c47e1903bfaf24e49af8e2e961035d4c09bdd06f23cf
Deleted: sha256:3594402fc4e848d23a222d12a02992944b79c63170b8d7e0362fa9394ceaadf5
Deleted: sha256:77a106bafb6a9b28852301490bb2835fd7ce53a7c730d4f8f5f03556027c9485
Deleted: sha256:11fe9f27d6777d812f193244993c0a42a34963fff32ac4b8ec60c6fec1bca98f
Deleted: sha256:9767e360b60c73e18caef1caa379b8e9ee984f813093039e75c9ed5cbc0b40df
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:0af072e749caf16306aeb4c1568ac6ed23c5b98468be70d2cfd67cebc201eae9
  Associated tags:
 - 20190701-235102
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-235102
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-235102].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:0af072e749caf16306aeb4c1568ac6ed23c5b98468be70d2cfd67cebc201eae9].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3696

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3696/display/redirect?page=changes>

Changes:

[hannahjiang] BEAM-3645 add ParallelBundleProcessor

[hannahjiang] BEAM-3645 reflect comments

[hannahjiang] BEAM-3645 add changes from review comments

[hannahjiang] BEAM-3645 add thread lock when generating process_bundle_id

------------------------------------------
[...truncated 208.64 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_12_47_27-7352719071507165178?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_12_54_30-7919707964121907924?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 829.663s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-193823
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff8dd37e5990e4d0368c0b07865bfa1b7049b0cbd7274b04a62f4597c90670ac
Deleted: sha256:ce609ca359b2f003d9b5167ed26d2b5e0143b51022e002568c74efa419ecc73d
Deleted: sha256:ae6e1321ad665aa9bdfcca3e505a9b54f7ad13835619f3b27ae9230ec890ec7b
Deleted: sha256:d52a3fdece78c33de83d84546fc41a8d81db253babd8e7552456d840ae8d29d3
Deleted: sha256:b56cf30e923ecb9b176d0bac6dacbc830da8d9b1c937288a673724f0c7584edc
Deleted: sha256:d7f308ac735f7a8db43ba3d27cb741834ec957a6f428c3faabf303dc614b98fe
Deleted: sha256:6654175892c5b1cc8268d4fa161817e1aef60529f2129c3c5418cafe99338067
Deleted: sha256:ada89d9284487f553f31c099239b431e9b0603cb62c8e02e5b485477894c22f5
Deleted: sha256:6860ab1420d30a742e0a1f6ffdc58e75f4fbfa505daae6babd8674ce051f5047
Deleted: sha256:dc99654ac4315930547c24c2bb744225aeb8462a9c61559fe98e4a7e42d9e29e
Deleted: sha256:29acd9fc8dfb7ca3089f0f6b56cb26e3124abfd6cd958481834452d100f9421f
Deleted: sha256:12be22230d37b0a5a0e025c4919f7924d2e4d3e532d8ee327d6d35e9af56dac5
Deleted: sha256:b5256b0f64dfb01774f77d2f2766cab0e6b426fa06b70cd1187031ae5dae62ef
Deleted: sha256:b8de197027227a14cf95f5a62a15a0ed0606ed8edb4518f99246517438497743
Deleted: sha256:59386f13982e04441c570f223765742bf204cd72aebbbba66f6f186a0ce0ecf5
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff8dd37e5990e4d0368c0b07865bfa1b7049b0cbd7274b04a62f4597c90670ac
  Associated tags:
 - 20190701-193823
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-193823
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-193823].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ff8dd37e5990e4d0368c0b07865bfa1b7049b0cbd7274b04a62f4597c90670ac].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3695

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3695/display/redirect>

------------------------------------------
[...truncated 207.81 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -446: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-424\x12\x04-422'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -446: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-424\x12\x04-422'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_11_44_30-17571315390522134697?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_11_50_53-7281847916252296748?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 889.370s

FAILED (errors=2)
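
For context on how the worker-side error becomes a test failure: TestDataflowRunner calls
result.wait_until_finish(), and dataflow_runner.py (line 1338 in the tracebacks above) turns a FAILED
job plus its last error message into a DataflowRuntimeException. A hedged sketch of observing the same
thing outside the test harness -- the project, bucket, and region values are placeholders:

    # Sketch only; assumes apache-beam[gcp] is installed and real GCP resources exist.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                 # placeholder
        '--region=us-central1',                 # placeholder
        '--temp_location=gs://my-bucket/tmp',   # placeholder
    ])
    p = beam.Pipeline(options=options)
    _ = p | beam.Create(['a', 'b']) | beam.Map(lambda x: x.upper())
    result = p.run()
    try:
        result.wait_until_finish()
    except DataflowRuntimeException as exc:
        # exc carries the terminal state and last_error_msg, as quoted in the logs above.
        print('pipeline failed: %s' % exc)
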
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-183410
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ec6c7ce1777c4715899e3c832d7f2633fe55733d866d6f5e91d25af2adaaacd2
Deleted: sha256:58a00ee2d847355a823dd25005cc59b6bfc43aa8a451375d978789d25ea6a4c3
Deleted: sha256:0f3aa4826b0f3263f3c004b7b689f995f72bf527c999f3c1daf09c77a2927c0f
Deleted: sha256:99a78137d0cc1c46ed427e64adb635459447595b6036e65642feb3a7493d31ae
Deleted: sha256:906494a3329283cdf489e39ec2b4c5c2cd9990f86ca02fff332ea4a2065ac434
Deleted: sha256:d6bcbdec1b885c5dee210ad6eb96751c0355e5f5d1e16016da1f24f3c7b30831
Deleted: sha256:9d34a4a4ece4320e9ee46a51734d4145dd42bc7378b60d1c4a0f434ce90ed8af
Deleted: sha256:2d258aed96e6cc57356b5ebb970674de82695073b80b8aa43bbc3de07270809e
Deleted: sha256:aa0edd4b9688acdcede2aa22fce6a8d245fde810dcab4709d2c5bf14e05caa29
Deleted: sha256:7d4756ab17ec614e7bc485b5f2b4cad2e3fd37931394ddf5b3178f594e14cb83
Deleted: sha256:e22ce3d2275639b62e6af70908bfa715fad82240fb0d7ad3b49358523a590a7b
Deleted: sha256:4c57d15dc3b5f2d6fd26dc2071750d0e27e69c4bc8ce59827796686fec9fcdfe
Deleted: sha256:d5a933b024e7426b022fb33617254c59268f1e299bb942e6a19f2cd28b73d323
Deleted: sha256:b585c13aa453172273bcc741030e3dfbe4b662a0ef5ba1f2c6c0ed6a470e1800
Deleted: sha256:efc91dd9fb4ff6620246e4c512b80d13955742af3a370509759b13daa7c26129
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ec6c7ce1777c4715899e3c832d7f2633fe55733d866d6f5e91d25af2adaaacd2
  Associated tags:
 - 20190701-183410
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-183410
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-183410].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ec6c7ce1777c4715899e3c832d7f2633fe55733d866d6f5e91d25af2adaaacd2].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3694

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3694/display/redirect?page=changes>

Changes:

[alireza4263] [BEAM-7545] Adding RowCount to TextTable.

------------------------------------------
[...truncated 208.00 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
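
The replacement module named by the warning is the v1new Datastore connector. A minimal sketch of the
import swap it asks for -- assuming apache-beam[gcp] is installed and that the v1new module's
ReadFromDatastore/WriteToDatastore transforms are the intended replacements for the deprecated v1 ones:

    # Deprecated import that triggers the warning above:
    #   from apache_beam.io.gcp.datastore.v1.datastoreio import ReadFromDatastore
    # Replacement suggested by the warning:
    from apache_beam.io.gcp.datastore.v1new.datastoreio import (
        ReadFromDatastore,
        WriteToDatastore,
    )

The v1new transforms work with the query and entity types in apache_beam.io.gcp.datastore.v1new.types
rather than the protobuf-based objects used by the old v1 client.
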
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -317: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-259\x12\x04-257'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -317: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-259\x12\x04-257'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_09_55_40-8587369215981780305?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_10_02_19-15799174409797617061?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 799.438s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36
Deleted: sha256:7dc3a5bdf99ed6265b3bad4a618d12a627a86c670a2ec0788d7ad5ca8122fd89
Deleted: sha256:117c3a550336be147737d97333547d71d69645654569f527d94894686558a07d
Deleted: sha256:9f07a2b86b60f53f776d54e0db63f8823481c816c57930aaefa866b41bbb838e
Deleted: sha256:e1906d2c523d98ad056a79a440f70fe4c32fd7f01fb8afb65f0c580e2cdb7995
Deleted: sha256:4bf81b465ea38cbe85551adb4319242987ffca8213e8e32a8aeef0363613f1c2
Deleted: sha256:0f846e210b4b40ce00c19c30a202976c1577169b90b5798ab5af55ba15f186e0
Deleted: sha256:d90e70aed17544ddd4537d3523bb965b6cd45030e66b00f4195b622fc980db68
Deleted: sha256:3db7bad8884ec31c3bf44887c8810392aae92ae6cd6014600b6306e81ec074e4
Deleted: sha256:01d1a8ce09f21d15bda98986f5dfc3df2b63009a912d033730388d11b63b11ab
Deleted: sha256:2e0baedee641bfe2c551078e530aea9fc4d3f7f6ba0a7cdd3af8823567aee392
Deleted: sha256:01def450d5642aaff856c2fe2ef73f45c901350478c36830dea00f6cca5b8fac
Deleted: sha256:aaa39830a4d147cd29622512c0a734b6732d7208ad6f9585dcd3dec355961eb6
Deleted: sha256:919d409554ab6cda4ec9ac390b627a5f902b179611670a547b1aed61e316eae2
Deleted: sha256:ab9fdb5142a35ce6bcd4ccb629f30e7c8630415a0dba27aaaf5bf6863bbe70bd
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36
  Associated tags:
 - 20190701-164733
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-164733].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:015f492554fb8bb6465635fdf3c2b27b74bc54d6a2cce6db9620ee644d226d36].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3693

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3693/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7640] Create amazon-web-services2 module and AwsOptions

------------------------------------------
[...truncated 207.92 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_05_58_42-169823318733672031?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_06_04_50-9698400765863936851?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 750.189s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-125044
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4eab22c51501a567bd918962e017750a9de2f601d2fe05b38edd68d237600c18
Deleted: sha256:7dea568ac202b30d4d0f93c2d0e875ff4d7a8d328b982c3ab7fe01e02635ef1b
Deleted: sha256:e28a2d189728d503bc2e8e60137a8e0188f731c70ab9ef91fa8ad657c480fe6e
Deleted: sha256:c04aeb11f826fa941145ce8153a90975a862b6ea61d97a852d707bfadf847fd9
Deleted: sha256:2619165a41349f0f8b8d1e7e2da55cdf44323532f553e68e76913e29c7d52059
Deleted: sha256:cc533d38b0e75236e9525fb804c0921cabcc30910ac51c9011d97a0937699f63
Deleted: sha256:6e9a16e8bc450cb5dfd3abd87a381fb2b79aa16a77954df9344e79c514947697
Deleted: sha256:a78c63c088ebdcd30c2c9f99a0744bd6f3bcfe7d082a85c84cd4ab8a9c651f38
Deleted: sha256:fbde2cf2a721f2651506f16b6aee359d34e57ec550d79625d5eebe18a6878276
Deleted: sha256:8d703206d34ad8dc5fa224b655f7c9c624c018fbff0bdc268c9888a9066bb3cf
Deleted: sha256:2212fa38974116fe5444dfa94518ead15ca5db603d21170c85f4ffdd38d3bb69
Deleted: sha256:fbd649927d7f00f3abf4266d05a29a296cc2ceaa10789636fe5256ee72a52533
Deleted: sha256:2e153d5d5f9676c4df23a469971dc0534b2e312caa6166b3a6960a18def0ff5c
Deleted: sha256:5b5648439e4ab87b16de2d4931bcc6c65004b354548c6954c59beb9de807cf0c
Deleted: sha256:1c506145dddd529b9c65a74cdca0a46b768f5c51f1f8faf6acb15033149fb529
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4eab22c51501a567bd918962e017750a9de2f601d2fe05b38edd68d237600c18
  Associated tags:
 - 20190701-125044
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-125044
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-125044].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4eab22c51501a567bd918962e017750a9de2f601d2fe05b38edd68d237600c18].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3692

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3692/display/redirect>

------------------------------------------
[...truncated 208.25 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_05_07_56-13902975674994164603?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
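
Note on the KeyError above: bundle_processor.process_bundle indexes a map by
data.ptransform_id, and the id carried by the incoming data element
(u'\n\x04-290\x12\x04-288') is not present in that map, i.e. the id the runner
sent does not match any id the SDK harness registered for this bundle. A
minimal sketch of the failing lookup, with hypothetical ids and a plain dict
standing in for the real harness data structures:

    # Hypothetical ids and a plain dict; only the shape of the failure matches the log.
    ops_by_ptransform_id = {u'\n\x04-291\x12\x04-289': 'registered operation (placeholder)'}
    incoming_id = u'\n\x04-290\x12\x04-288'
    try:
        op = ops_by_ptransform_id[incoming_id]
    except KeyError as err:
        print('no operation registered for ptransform id %r' % err.args[0])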

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-01_05_14_34-1913315805920410296?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 738.637s

FAILED (errors=2)
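
Both failures surface through DataflowPipelineResult.wait_until_finish (see
dataflow_runner.py line 1338 in the tracebacks), which raises
DataflowRuntimeException once the job reaches the FAILED state. A minimal
sketch of catching that, assuming the usual Dataflow pipeline options are
supplied on the command line; this is illustrative, not the test harness code:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    def main():
        pipeline = beam.Pipeline(options=PipelineOptions())
        _ = pipeline | beam.Create(['a', 'b', 'c']) | beam.Map(lambda x: x)
        result = pipeline.run()
        try:
            result.wait_until_finish()  # blocks until the job reaches a terminal state
        except DataflowRuntimeException as exc:
            # The exception message carries the terminal state and the runner's
            # last_error_msg, matching the "Dataflow pipeline failed" text above.
            print('pipeline failed: %s' % exc)

    if __name__ == '__main__':
        main()
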
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-120010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b6f6203e679c1cce3436be3cce200c074d7da244f75f75fddf4a6bcc4303d6d5
Deleted: sha256:79898cfb79cffb95443b3a2969d1622f9269139da61ff390737b61450b00c7d7
Deleted: sha256:01322913b6155222fe4ef771c9348f78d5c7f600cc0cb66f06a471f82acfa0e5
Deleted: sha256:07513602fe31eae0f2d491b815ec72802a9451108125739ba79da12c28d66c77
Deleted: sha256:3958fd62f23278f8e442bddadf3f2c6b2badddd64e7815cbc87e1dc173ffcbd3
Deleted: sha256:fd7876993a8587d2dd8d363617e0253fa056c27be33034b0803e2b889bdde770
Deleted: sha256:2fcc3d8a3b1ef8c0446b67f4b4f92743b69868eef544151b77e25ad8b9f0ccbd
Deleted: sha256:e039314bd5835a42c1a6aea01d4339ca5309a53461d5551a6813f1aa9b5ea7f0
Deleted: sha256:851515b6f67ff69996ad66c007019c37a2b6b66db41ffb67c8c27b0fc8d0d2c9
Deleted: sha256:12b0ca85889e17951afbac078754681da6a894ff6959428221338b6fe4b831a6
Deleted: sha256:77ecd26fb56b329f975fc8e83e7a7ba479c359bfb6e491e9be806b251a859ec3
Deleted: sha256:b820b70b6528d79d3c2cb494630ee45e39c6afd7cb8236a41a32df70e3f3abe8
Deleted: sha256:899fcdbff0d432c61559c40a85afbad7cffb3e367d0d2c6fe80e5c71044c73a1
Deleted: sha256:afb8486f5c28cde0fefc1883e924e3baf09d040ee0a16a3ae53b2d42c960b179
Deleted: sha256:bd444eefcc4ca89bcef6838740aa4628d670d39a7b1f105f34c63b7c420f80a7
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b6f6203e679c1cce3436be3cce200c074d7da244f75f75fddf4a6bcc4303d6d5
  Associated tags:
 - 20190701-120010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-120010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-120010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b6f6203e679c1cce3436be3cce200c074d7da244f75f75fddf4a6bcc4303d6d5].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3691/display/redirect>

------------------------------------------
[...truncated 208.24 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -410: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-352\x12\x04-350'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -410: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-352\x12\x04-350'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_23_08_37-6069273861588078448?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_23_15_50-12506290409114216380?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 804.924s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-060017
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4cb97e29a64290f29cc142952358a1dbda898e7b4d103fdb8d721829b8eefe36
Deleted: sha256:cc283dc49f8cbfe108449f5b7145317e40875b6e2c6728dc5fef00bc7089a5fb
Deleted: sha256:7dff4bceee8b700924d7038d72a084573acf0c083cd6e4c411e28024753adc39
Deleted: sha256:a29fa8eb86f790fd31c20fc61696cc507596e22a88845b83cea278d87d9ea47b
Deleted: sha256:dafe22c3f7edeb829a10899a495bd46fe386d68b6da3122b8eceba53deaad01d
Deleted: sha256:285db745fa45d8306958b8073186e8d1ab6703b8574d4a83b02ca3b59a98b3f5
Deleted: sha256:29b7c58dfc1decfa5278f03d2a2230a19f68ce65a86ab963dcb77b2f59713081
Deleted: sha256:970828864eed07d155dc69558cad7ff540a4af1f86fa31414067d531cb66107f
Deleted: sha256:213fd57f13cef5ff49b90252155771757ea0d5d8fa219284fc6efbf6ddcd71e8
Deleted: sha256:721eeded964e5c4bdd87930a68b675482de51cb1e10fd34ec4f9dc972f8ef00d
Deleted: sha256:4683cb06610cd7971a561d45e27fe4f5216184f99dc0080e68c799b1cf9dca7c
Deleted: sha256:9e68ea3b292e2544211c4c54a671b0e20a17055d8ef732dbdb77188431524758
Deleted: sha256:1a061c85ce09ea58d2caa2868e8f7900d55a27c20a590093567aca86aadd8ac3
Deleted: sha256:d5b3d3084815e95cdc72b200ec5c81aa450e60781cd7cd39e2d6d6c06c89427a
Deleted: sha256:bb7cefb236dd8bee10d300ec691842cc321d61fd361d16f6617fa57b6f9f6423
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4cb97e29a64290f29cc142952358a1dbda898e7b4d103fdb8d721829b8eefe36
  Associated tags:
 - 20190701-060017
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-060017
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-060017].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4cb97e29a64290f29cc142952358a1dbda898e7b4d103fdb8d721829b8eefe36].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3690

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3690/display/redirect>

------------------------------------------
[...truncated 208.20 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_17_08_40-5841956548398661571?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_17_15_18-9595244814895190416?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 834.855s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190701-000011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:49b2308e16160caffc3a28844c416bcf4c15956f7d62fe6e2ab51ee01d96a099
Deleted: sha256:8d280d7b7add44d2ef30497b36cf2fa2fff994dfafbe5e8ec8219a107b9367fa
Deleted: sha256:cb725cf14df23395fbe76ca5c2b11f36aace779bee2a0175aeefaa055039331b
Deleted: sha256:42452f4a77b4f5cf1e99ebc515d59dd0894cd0701de4fd462c47de1db79938c7
Deleted: sha256:7bafa6927e6e1c0f19d1a76230c63357f4b8c02bfea9b2c21337416bb9f736de
Deleted: sha256:a2568e9505cd398d11fd2eefb0dc8e47376b198115d636f2dbf6027ba5dcd4d3
Deleted: sha256:eb65768305fc58bcff9ad53a6268dce30ddc3ee7556c485ee9f5542ca8d91fb5
Deleted: sha256:d73d873432e63818b7a5d73caa6ad20402b2a39c89ad02edcf5771b06a0c7244
Deleted: sha256:584d02c9b4159e7cb33c6af2c4235622a1027fecf8656f2d0a4009218886d54b
Deleted: sha256:ebe490b933ee76d0b730a58a8a8f6b079440c56648c2ed6058b02378f025fd7f
Deleted: sha256:deb85116ef90b17cbae518ede474e398aea13eaea1c55bac732534bb0b500561
Deleted: sha256:28eecb78824dec7e485c349d23ac82381cb16372d2a71802e7817d9bd2e32621
Deleted: sha256:ddd0019b44082950ab776340e30cbdedbe0c7ba60bd6e9666ebb338260ca49e2
Deleted: sha256:3ed29f0290107e00643f393121816f99cb78591ad6dd1811b0b0119614fa83f4
Deleted: sha256:857cf3e906bf9cdf5dc9860d861e9aad6fc9fa31b4ad80279326f1dd45697b54
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:49b2308e16160caffc3a28844c416bcf4c15956f7d62fe6e2ab51ee01d96a099
  Associated tags:
 - 20190701-000011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190701-000011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190701-000011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:49b2308e16160caffc3a28844c416bcf4c15956f7d62fe6e2ab51ee01d96a099].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3689

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3689/display/redirect>

------------------------------------------
[...truncated 208.62 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_11_08_29-5253770123226336625?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
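
Aside on the KeyError above: the last traceback frame shows process_bundle() indexing a dict of data receivers by data.ptransform_id, and the raw key it was handed (u'\n\x04-266\x12\x04-264') was never registered for this bundle. One plausible way to hit this in a ValidatesContainer run is the freshly built worker_harness_container_image shipping an SDK harness that disagrees with the runner harness about the bundle being executed; the log alone does not prove that, so treat it as a hypothesis. A minimal Python 3 sketch of the failure mode follows, with hypothetical names and ids (this is not the actual apache_beam.runners.worker.bundle_processor code):

# Minimal sketch, assuming only what the traceback shows: the bundle processor
# keeps a dict of receivers keyed by an encoded ptransform id and dispatches
# incoming data-plane records to it. Names and the registered id are made up.
def process_encoded(receivers, ptransform_id, encoded_data):
    # Raises KeyError, surfacing the raw encoded id, when the data service
    # sends records for a transform this bundle never registered.
    receivers[ptransform_id].append(encoded_data)

receivers = {u'\n\x04-262\x12\x04-260': []}   # illustrative registered key only
try:
    process_encoded(receivers, u'\n\x04-266\x12\x04-264', b'...')
except KeyError as unknown_id:
    print('no receiver registered for', repr(unknown_id))

Either way, the exception carries the raw encoded id because the lookup happens directly on the wire-format key.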

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_11_15_47-2551620475198357571?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
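
For what it's worth, the opaque keys in both failures (u'\n\x04-266\x12\x04-264' and u'\n\x04-107\x12\x04-105') parse as two length-delimited fields in protobuf wire format, i.e. a pair of short id strings ('-266'/'-264' and '-107'/'-105'). The sketch below decodes them under that assumption only; it does not claim which Fn API message they belong to, and it is not code from the log.

# Hedged helper (Python 3): unpack a key like u'\n\x04-266\x12\x04-264' assuming
# it is a protobuf-style message of length-delimited string fields with
# single-byte lengths (true for these keys). Field meanings are an assumption.
def decode_key(raw):
    data = raw.encode('latin-1') if isinstance(raw, str) else raw
    fields, i = {}, 0
    while i < len(data):
        tag = data[i]                          # (field_number << 3) | wire_type
        field_number, wire_type = tag >> 3, tag & 0x7
        if wire_type != 2:                     # only handle length-delimited fields
            raise ValueError('unexpected wire type %d' % wire_type)
        length = data[i + 1]
        fields[field_number] = data[i + 2:i + 2 + length].decode('ascii')
        i += 2 + length
    return fields

print(decode_key(u'\n\x04-266\x12\x04-264'))   # -> {1: '-266', 2: '-264'}
print(decode_key(u'\n\x04-107\x12\x04-105'))   # -> {1: '-107', 2: '-105'}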

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 794.597s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-180009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2ef195c487ff4f93760706ef9cf8cc8840fc253fe7f32970c5f9d261d09867ad
Deleted: sha256:2234642c8b0ebc79d04d0a67fce68fd209fb8e8522668d0c85ca7b75a501ecc2
Deleted: sha256:421c4488c4f7fa3509fe9cde52a87dd8c3884151015bb384b04a0ea327455292
Deleted: sha256:0500e5f912c89dc27f70a8d6b9e03fbad8805f76cef2ad9046ad3ab677aef17b
Deleted: sha256:da435bf1bdde69e3fbb7762bd6ebda71bae8a5c7d8c2d0471396fe991be454ba
Deleted: sha256:66110d6ed9fb752f5cdb1233016342452ade9e2bb6f5e68dc599a602534e044e
Deleted: sha256:10a45dde525132ff02cf54790395ac73570d0deba359f2e982ebfdf0c82dddfd
Deleted: sha256:1ddb871996ca8f4a8cea6d582d8b91961479ac64456eb0050036b8b64128c646
Deleted: sha256:751b79b410104222339f1831d0c5c354b87ac495b3f9d33352b20ca4afe65b92
Deleted: sha256:edc8ff7d38802d1743ccce08626763ca113943c7d8ec676451f3e7c1047fa2f8
Deleted: sha256:040057ffaef94275b25b7cbf5e9c16d7eccdb881eb4f7732d644cfe5ed885d29
Deleted: sha256:0d6d07fcd6e7bd61cf345417368858f2accf455eea10e2ad788cd686f252c2bf
Deleted: sha256:e56c2b9bab469f0a6501ae47606b6e142e8672cc684f0505f4f153dcfa51e9a7
Deleted: sha256:abb2cef2260209c42e8f5c2c858594492870d8ef029f09dba0212bc00a78123d
Deleted: sha256:ada478ee3cfb5d90a081b778eaa49f511d7ab0ba9d6aa71f4414d809b55648c2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2ef195c487ff4f93760706ef9cf8cc8840fc253fe7f32970c5f9d261d09867ad
  Associated tags:
 - 20190630-180009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-180009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-180009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2ef195c487ff4f93760706ef9cf8cc8840fc253fe7f32970c5f9d261d09867ad].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3688

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3688/display/redirect>

------------------------------------------
[...truncated 208.59 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -349: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-291\x12\x04-289'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -349: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-291\x12\x04-289'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_05_08_29-17554109334196484419?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-30_05_15_32-13033466020270711883?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 793.842s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-120008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8f9518f5bb50d528c19d7913d246217eded7ec1bbb65744947fe9afb56aff9d
Deleted: sha256:1b163bbe89884df7073d09945091a766bcf89388741ce97ca27fcbab45e1e9ed
Deleted: sha256:0c087c07c59ec8744d7f5618da21a8d0747e69cfae4a5111cf23efff2bc05c9f
Deleted: sha256:406fe45146c32019cb6d06834a3bb1b3d6183c0d12a437bd2004d70616622b73
Deleted: sha256:7060d0eb1a39571473025cfb32b3d2e35fe1340ae91bbfb4b92b74aa821a86ee
Deleted: sha256:a6f31e2b83be5fdee6f226feea3646f45c0a0e571e47130136fd92b93cfdd736
Deleted: sha256:613919269b4aa7bf4dcd96eded77d097e1c59541555b455ffd664f1dc1a1c9c1
Deleted: sha256:519116c01671b83f7b2dba28271471cc4e51a01074d647aeb24f74b59c92e5e3
Deleted: sha256:ee75ff856e5d7718f1c2cfcd05eecf55bf3e8a84f463a5d10f65c5e46c7cb8ff
Deleted: sha256:8b7bf44bf2f8dae26a624833cc6bf6989d0b45ea25bfa7afdb2327ec90fba416
Deleted: sha256:9af6b9bcc37ef10be5ee12203c92409a0fd14e33df6abe1076175d5dedc7069c
Deleted: sha256:77df6ec1cec8524a042ebada7c0125c133f482dce156f49cb4f0939e9819ead1
Deleted: sha256:807bd697f45914709eb43c37998232146bd8fbfab1826dcfb59cd43235c4fd5a
Deleted: sha256:37bd919f30d0cad77057f39f0019c3f9515fc0eb1ee8f4b83eaddb92a87e80a3
Deleted: sha256:287ee5eca295198539fa98161f9f979d6116ea20c2daca448c294f12a4135706
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8f9518f5bb50d528c19d7913d246217eded7ec1bbb65744947fe9afb56aff9d
  Associated tags:
 - 20190630-120008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-120008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-120008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a8f9518f5bb50d528c19d7913d246217eded7ec1bbb65744947fe9afb56aff9d].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3687/display/redirect>

------------------------------------------
[...truncated 208.36 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_23_08_29-2002597637765519090?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_23_15_12-3517421203121459086?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 830.199s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-060009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a3855f12309982a94a3854ec8faedeb047708b0a2481177be6409b3b205c627a
Deleted: sha256:2ef8c98b05d39ed703b6e1792ac94a19bdb48f01be1b2b32f24888daff118d5a
Deleted: sha256:19def2b7b5e159a3dbb341a037420809c9390cfc97ce7a42cee837efed8b77c7
Deleted: sha256:80a22d48e3c03d925960ef3f4fa127c203ae292535c6e87c17dff05c3c3b51d0
Deleted: sha256:8babd2047cf465150deb6240b4c29be9aedaf0099593d533ac7817d04b4a4902
Deleted: sha256:d8a3099abd1e8b36e3644929d64c3fd636faef44c708c185239ae28e3fc1ca75
Deleted: sha256:a7db2e25b7510a284c2c761a4e8a34d020509f9ef9193be08553b2b53c12c5c0
Deleted: sha256:9de632465b7f936ea9e48c6fedc0c12e9d1b0aa4eb2e3cd17b283fd8bbfcd562
Deleted: sha256:71452f721f262f28f9cc217312fe9c46b793c76cc1b1b0d4d78b6692e50f29c1
Deleted: sha256:8e240679e39cdfe5c97e5ad9be1e7fb33e29210c64d0a9259d77a598af198217
Deleted: sha256:fdf54abd1ce3860d591527fdb2dc1806cc55aea8bdeb59648ad8e2101c6e8d5e
Deleted: sha256:e713d4a9f800ed4c86b7299ac5e0f82ca323b19bdc8159e5311fc0360699c701
Deleted: sha256:b36c712ceb90bc381425941a73d071189ca3335161ff5d185231fa08dd4f0c99
Deleted: sha256:0554b88af58f139a964546a9b02b5d1c0771b963b41265b48862521271740319
Deleted: sha256:548162a9c9bf73778e824bebb8417766ff76353547bba6dfc340ff775bab8c05
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a3855f12309982a94a3854ec8faedeb047708b0a2481177be6409b3b205c627a
  Associated tags:
 - 20190630-060009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-060009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-060009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a3855f12309982a94a3854ec8faedeb047708b0a2481177be6409b3b205c627a].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3686/display/redirect>

------------------------------------------
[...truncated 207.93 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
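
The replacement module named in that warning is apache_beam.io.gcp.datastore.v1new.datastoreio; a minimal sketch of the import it asks for (the ReadFromDatastore/WriteToDatastore names are assumed from the v1new module and require the apache-beam[gcp] extra, neither of which is shown in this log):

# Hedged sketch of the migration target named in the deprecation warning above.
from apache_beam.io.gcp.datastore.v1new import datastoreio

read_transform = datastoreio.ReadFromDatastore    # assumed v1new replacement for the v1 read
write_transform = datastoreio.WriteToDatastore    # assumed v1new replacement for the v1 write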
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_17_08_43-2807750522873383878?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
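
The KeyError in the traceback above is raised inside the SDK container's bundle processor: the data channel delivered elements addressed to a ptransform id that was not registered for the bundle being processed, so the dictionary lookup quoted at bundle_processor.py line 593 fails. A minimal sketch of that failure shape (the registry contents and ids below are illustrative, not taken from the job):

# Hedged sketch of the lookup that fails above: data arrives keyed by a
# ptransform id missing from the operations registered for this bundle.
registered_ops = {'-320': 'remote_grpc_read_operation'}  # placeholder registry

def process_encoded(ptransform_id, encoded_data):
    # Mirrors the plain dict access in process_bundle: unknown ids raise KeyError.
    return registered_ops[ptransform_id], encoded_data

try:
    process_encoded(u'\n\x04-322\x12\x04-320', b'payload')
except KeyError as err:
    print('no operation registered for ptransform id: %r' % (err,))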

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_17_15_01-731854156366602360?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 758.946s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190630-000015
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9d9599530fec7001b13bdb4789d13e34a2eea04ce306a0b434fe171fae7cd54
Deleted: sha256:6142546cece44837bdf5a92aeae1dfca8caa1f2a3a0506c9d99ffb3f7965cd92
Deleted: sha256:ba6eb2393fdaa442f84c606261148e50b1f87b728cee5fd6494afb2a177b2aa0
Deleted: sha256:d52388dece9e3db1e782aab7c2034dfef41505602694eb613c902cb8daa39c96
Deleted: sha256:92ac2ea9638fe0717a7bfe387a74577c90a94013b763c5093737f3c1db138e5c
Deleted: sha256:fefa1e5bc6d0a2715903594fcfb16678c6aab860b7194f69b080b6d01d263baa
Deleted: sha256:bf68c0a1121336086d1ec8527190c288078fc18673e6476087626d011c14a9cf
Deleted: sha256:008b50e31791ced785cb62b0aba24b1bdccae362330588fe2e922bc7cf49caa5
Deleted: sha256:11b32a67bf53d0c04fb74c7fdfe3cb59fc7e1805518fd63b5c800f55f43f26d0
Deleted: sha256:0834423f3cd6d8c5cbb0cd99d89d2756227ce395d0341ae1d1d2561d0f6436fa
Deleted: sha256:c6d4df0327a5bd7812e4ca5e75c782723b3fdf7a4df04672c1d9578618f99446
Deleted: sha256:347eac95c7f7f4cad564ea296a7b81f8161c2eda0d3be2b189677b2407ec3718
Deleted: sha256:7eb0c4adf83ca4b4bf30a0cd33ff21669c6343c9db466c2f070c64a5df06fdec
Deleted: sha256:5f449e888efc951ee99ebb5410f14819b5fa8df7d4376a4ad1652d41f7f35e10
Deleted: sha256:f4502c6e146dbae32e768de61ad7d12c2f3cdcbf43bd85115555a83a2d57f135
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9d9599530fec7001b13bdb4789d13e34a2eea04ce306a0b434fe171fae7cd54
  Associated tags:
 - 20190630-000015
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190630-000015
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190630-000015].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:f9d9599530fec7001b13bdb4789d13e34a2eea04ce306a0b434fe171fae7cd54].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3685

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3685/display/redirect>

------------------------------------------
[...truncated 208.51 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_11_08_33-10210262735420284253?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_11_14_26-8911216390363898068?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 743.682s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-180008
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:0c5e96675360c16d6d262670c8f2be36573be1de30abe9278f1d92e78a73caf1
Deleted: sha256:473dc312a7f204b71d2a26635bdb54587825330d00114f78505fb15bd70ec401
Deleted: sha256:29349ba62bf7b494130cf0845db0414c58affb3956c4e4d1771298b8b13e0bca
Deleted: sha256:e0c3ac03ec759e65783053e349b7d3569a0b6a9b83a689caeb7a0309bc514672
Deleted: sha256:cda7b90483cf30f140fccdaf3b4b7c2da096f039f0f1df16ba9315e39a3f73f0
Deleted: sha256:9f0a955b7b793eb024cf8a95656d433b77a07f8bbd4f6239404bb731a3b55f72
Deleted: sha256:0228b77f253c2dbd3fe0cd923360ff2dd220cce2109aa2ff5cfa3948d1f2f3d1
Deleted: sha256:cdfd09f8e42fe79f1392f5466120abd96e4cb14851d95cc63b164f5292abac2a
Deleted: sha256:0869643c6384341b8e2ae6efd4dcf9938c1e5753d2f1d0b819b7c2b0264ec68a
Deleted: sha256:9d4cec50aaaeda2d6fa4e635caea367e4ab7de18557f0a2cdee3c7a53809b6d5
Deleted: sha256:597dc544910cd44f766bd294377811b74dcb6dd1644715db85916905f0382b5e
Deleted: sha256:d0dd8f82d343e10944a50578c6e7e7d1cdb18f541e520ef51bf2e11c55dbbc84
Deleted: sha256:aa79d0499b5380729b754c9a1d6968740fdeefde849fac9900e5a8302d995555
Deleted: sha256:9a830b1d6eb8373487e362111aadd522dd5b396138b3d12ad2fe12a192d33cea
Deleted: sha256:7b303a5b55df83cebe5ef4a0361ab5bb9c24cf424f5dbcd669443b02133b823b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:0c5e96675360c16d6d262670c8f2be36573be1de30abe9278f1d92e78a73caf1
  Associated tags:
 - 20190629-180008
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-180008
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-180008].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:0c5e96675360c16d6d262670c8f2be36573be1de30abe9278f1d92e78a73caf1].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3684

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3684/display/redirect>

------------------------------------------
[...truncated 208.23 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_05_08_33-8824590905822626715?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -128: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-106\x12\x04-104'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -128: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-106\x12\x04-104'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_05_14_42-9343493441872925995?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 719.867s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-120017
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a51212f609a165a248257c36158170a34a8633315ec2d57a959625927c370337
Deleted: sha256:dd49020b9fd39677d124c29fb33d9ecabf3e44019d4f397aaa6710c25874f59e
Deleted: sha256:eaed81ab7893df8ed74dbb39dd1fac7274de0015237c79e403c81aff0bed469d
Deleted: sha256:d3beb4c544c32e60fbfef36c2c03066ce49a18a6a553b3ba91d53368188cf030
Deleted: sha256:db26aa9d6290a7ec187c475bd96d11ef037d8c9a039eec2f47558b40c4bc0fdd
Deleted: sha256:c78e755d37e32b3aec2006bd7392ffe7475444998262b2710800c467e1b5598b
Deleted: sha256:ff7df4e9a4c3cf5488ad42e93b39b8a900aad69e860aeccfbb2ba4668a28f486
Deleted: sha256:7429b99f6f24414cfce93a6c9942471e68874fc0eaa335cb770bcdecdf1d22e8
Deleted: sha256:7863bc062b6ad4772013a54142461ae3fe7a637f9135597ca83fe9fe090f3ebb
Deleted: sha256:9e0270d481784bf23b1fb825d9d3e7a001177a39827e0fd9e8bc1a71fd0ea855
Deleted: sha256:535cec9318e4eda51b3b675dab1a90d262da8abf1ab477652b9fa41440cb08ce
Deleted: sha256:3c4b6050660e522871aa5a9c6403d8b8e3f5dbc03596a058710b2c5606b4357e
Deleted: sha256:3baa7c274babfece66853f3dbd5944eac3f4a3d850226cc7d5913f39eff8cf2d
Deleted: sha256:5df7baf57a71332914f8a57757a34c31ae8d1fe1ba8254528848385f0d7fb4b4
Deleted: sha256:05bc4ce73698a853731815326d0380ba5227e6127e25477ba53bb9350e172575
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a51212f609a165a248257c36158170a34a8633315ec2d57a959625927c370337
  Associated tags:
 - 20190629-120017
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-120017
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-120017].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a51212f609a165a248257c36158170a34a8633315ec2d57a959625927c370337].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3683

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3683/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6692] portable Spark: reshuffle translation

------------------------------------------
[...truncated 207.64 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
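The warning above names the replacement API directly; a minimal read-side sketch of the recommended v1new Datastore IO follows. This is an illustration only: 'my-project' and 'MyKind' are placeholder names, and the Query keyword arguments are assumed from the v1new module rather than taken from this log.

# Hedged sketch of reading with the non-deprecated v1new Datastore IO.
# 'my-project' and 'MyKind' are placeholders; check the v1new API for the
# exact Query arguments before relying on this.
import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

with beam.Pipeline() as p:
    entities = (
        p | 'ReadMyKind' >> ReadFromDatastore(Query(kind='MyKind', project='my-project')))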
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_00_36_16-7846030029129310847?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-29_00_41_54-2307305936616572146?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 674.918s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-072819
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d0a6b86f3a3e4161e5cbd751a854393a230f732182370150216be145af69f95
Deleted: sha256:f0f5f214a3cfadc0cc52b475f63b78df17280837259beab5932eb295ff13e4ae
Deleted: sha256:35a57d4715d400c344a508a0a93f034f169c40ef2e4e24df3935b47694133e8d
Deleted: sha256:02cc1b070daba51364f8ddd7b0eeb73901c88069175cf13a44d8b3a8f4ccae1c
Deleted: sha256:7256da4bf673bf23dfeea4b9370b6e61b3af788852e073fecaf8b42c686b08bc
Deleted: sha256:3877d3cd040349f2737e9216b65fd6504bfb58d513d83df97bbee3b77d9049ae
Deleted: sha256:519f166840ccb70acd8ac6b665e29a0a3ede759a308078f662dfc639adc5983e
Deleted: sha256:cdcc57d4f7e98721d616bc8d689251869bb9361400b04a1f4fee3d4fbf02f8f8
Deleted: sha256:2dfd107509140c03924497144bb644349931a0a18ade2158aa31fca6465cbe57
Deleted: sha256:7abc7ebdcaa512fc3ca1f43c41a605cafa68e4ed3e6a16d83df4dff32cdd2249
Deleted: sha256:4a7ebd75b3857761a81a0b01c9e70b5555cbb98e0129390ef3ff780b43320f2d
Deleted: sha256:70f7ee1152e7eaf644f613d1e30c896afb3efa0781f71de73920ab8eebb680f6
Deleted: sha256:1870bf578cb8777bd2f3970b7d741e72c33885a75a3b634cb3013d6f2b189f30
Deleted: sha256:c1a78df410d49227ea9f5011d98195ec48b505b63648b3699cc1f98fa5a7515f
Deleted: sha256:1f6388ce565e7a0d7510cbe785d6f227de08935697370a2a56f83c572851c9a2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d0a6b86f3a3e4161e5cbd751a854393a230f732182370150216be145af69f95
  Associated tags:
 - 20190629-072819
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-072819
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-072819].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d0a6b86f3a3e4161e5cbd751a854393a230f732182370150216be145af69f95].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3682

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3682/display/redirect>

------------------------------------------
[...truncated 208.14 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_23_08_41-13780641267024281635?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_23_15_03-3709687458352547359?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 754.436s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-060012
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d44ada53d7b584b89d192a8a0872fe655d57d5ecf69f92dc5600f85865d90589
Deleted: sha256:ea0a9e49c6e684e0b6f9fc90cc39d0b4ca9b22d4c4a863d3e1556cd51cbfa795
Deleted: sha256:398d29eb4cf69f6c0aa873ae73704c7cc39dd8d45c58cb07662c18800a6fb420
Deleted: sha256:0e4e5d5afab969e47788b4bd4303affe382f492a70c94ee4c272f0452ea02e1a
Deleted: sha256:ed6aa976cd83699b4ba5e706566f281f5343a4e364ec59ad3a9a641ee0ed552d
Deleted: sha256:1491343a1f831f7f70092978789f18399a666b8b47ea8f9f21de331f34108e24
Deleted: sha256:ec389742a3c2f48d4223864b7fcdaad92d6b26c8eec39c0229ecda04c95dd55e
Deleted: sha256:217732b8b58a15d915d0a6abd2c3e4e017da62a091cf6684d39c401838add5e4
Deleted: sha256:43b9555f4971297aa33eace2e4ecb91a1a17276e8d364093839765be9cbf4509
Deleted: sha256:a65b0970faec1e74ada4c287f9e76f87caeb58524d24edf53fa17b2404761bed
Deleted: sha256:3ce1d68e112bd1cbe06ed8773abbd076da27a51ed08736bc8089d815f5e836bf
Deleted: sha256:bbd913a32e82b0c79e178e034de2e23133fd062d0da98468332b433d8a1797fc
Deleted: sha256:c45d71ccf3b8bfe66b898c18c48f9138adfd417e040ef07a511206c9b9ce55ea
Deleted: sha256:07f06458d651d63617dc0e063d7f4f7c8b39fa383cdeaebf0309d00da34712a5
Deleted: sha256:a868b980bb3a89700238b220ff56c1ebee68a99525710898d05c71eb4ed7590d
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d44ada53d7b584b89d192a8a0872fe655d57d5ecf69f92dc5600f85865d90589
  Associated tags:
 - 20190629-060012
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-060012
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-060012].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d44ada53d7b584b89d192a8a0872fe655d57d5ecf69f92dc5600f85865d90589].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3681

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3681/display/redirect?page=changes>

Changes:

[heejong] [BEAM-7424] Retry HTTP 429 errors from GCS

------------------------------------------
[...truncated 207.68 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -411: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-353\x12\x04-351'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_18_46_39-2108423113879928544?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_18_53_17-15804849240340790604?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
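
The failing line in the tracebacks above amounts to a dictionary lookup: the SDK harness keeps a map from a runner-assigned ptransform id to an input operation and routes each incoming chunk of encoded data through it. The sketch below is illustrative only (FakeInputOperation, route and the ids are invented, not Beam code); it just shows how data addressed to an id the harness never registered raises exactly this kind of KeyError, which is often a sign that the runner harness and the SDK worker container disagree about how those ids are generated.

    class FakeInputOperation(object):
        """Stand-in for the harness-side operation that decodes incoming elements."""
        def process_encoded(self, encoded_data):
            print('decoding %d bytes' % len(encoded_data))

    # Keyed by the serialized target id announced when the bundle was registered.
    input_op_by_ptransform_id = {u'\n\x04-1\x12\x04-2': FakeInputOperation()}

    def route(ptransform_id, data_bytes):
        # Mirrors the failing lookup: an unregistered id raises KeyError immediately.
        input_op_by_ptransform_id[ptransform_id].process_encoded(data_bytes)

    route(u'\n\x04-1\x12\x04-2', b'ok')      # known id, routed normally
    # route(u'\n\x04-108\x12\x04-106', b'')  # unknown id, would raise KeyError as above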

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 743.631s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-013927
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d6d847773c3c0c3c25596d7e4f092ae804987913479a7fdea30ac7708fda271
Deleted: sha256:8e9fd848475b5a4811ea5bc06b6f6abde6797bdee2068eeabe24f30ff75a0ab9
Deleted: sha256:d35fd2431b8378cf9166a1cfa905e579c6a1277f73ffdd108ecf008727cfc149
Deleted: sha256:568bc4504211e76741015413fbfabc7fc5907d9bf1d6e167b908a903ec9ee3fe
Deleted: sha256:22b887b7f7677f276ca3f41457c732abd5705fe58279056066a6b37c0f654e5d
Deleted: sha256:cd35e130abd4dbc5a0c2c55587bc4889805250e4c4b9eab91392af98162bfefb
Deleted: sha256:f4f57d019d478f2c0703dc7bc0f92366022ad4d48c6f912a2e9ce16440ef0963
Deleted: sha256:618c49f729992d3d3ebd0ce44612c9540c67528f5ead9f890af43ddb44b1894e
Deleted: sha256:75e00a9ef6ad0a1ddac65b29bf4f9f271510b324dd57119056f5c3a4473ccb57
Deleted: sha256:e56fa303b7f950875551c63221a613773c67105733a2e51862aa6e32be173ea3
Deleted: sha256:57f52cd1fc39a6446225c14c52d2ea3d131af50d7c68701c650d5302bf315ce2
Deleted: sha256:ed24240ae6af1fdd9f417522e863c5b868c06e0dd9104b462e35a06a14393740
Deleted: sha256:276c4f02e3c33d4900d6eb16051063984b0d8e8249fea1081b378e5d21f7ad71
Deleted: sha256:33514e1a9f69599598750415461236df32b57f18f7f8db703ac6a6a8c83d2eae
Deleted: sha256:1a9442a2c1523df94465d7bf5c98a5135d03b035dadfdf61e071f42a59a00490
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d6d847773c3c0c3c25596d7e4f092ae804987913479a7fdea30ac7708fda271
  Associated tags:
 - 20190629-013927
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-013927
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-013927].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1d6d847773c3c0c3c25596d7e4f092ae804987913479a7fdea30ac7708fda271].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3680

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3680/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7589] Use only one KinesisProducer instance per JVM

[iemejia] [BEAM-7589] Make KinesisIOIT compatible with all other ITs

------------------------------------------
[...truncated 207.78 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -287: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-265\x12\x04-263'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_17_10_54-6122378245854945616?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
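
The byte string in the KeyError above is not random noise; it parses as a small length-delimited record carrying two runner-generated id strings, here '-265' and '-263'. The snippet below is a rough, illustration-only decoder that assumes proto-style wire framing and only handles this exact shape (two length-delimited string fields):

    def decode_target_key(key):
        # The log prints the key as a unicode literal; treat each code point as a byte.
        data = key.encode('latin-1') if not isinstance(key, bytes) else key
        fields, i = {}, 0
        while i < len(data):
            tag = data[i] if isinstance(data[i], int) else ord(data[i])
            field_number, wire_type = tag >> 3, tag & 0x07
            assert wire_type == 2, 'only length-delimited fields expected here'
            length = data[i + 1] if isinstance(data[i + 1], int) else ord(data[i + 1])
            fields[field_number] = data[i + 2:i + 2 + length].decode('ascii')
            i += 2 + length
        return fields

    print(decode_target_key(u'\n\x04-265\x12\x04-263'))
    # prints roughly {1: '-265', 2: '-263'}: two ids the runner used but the harness never saw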

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_17_17_32-5116264454266151282?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 824.391s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190629-000235
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:9f1821f219e838515c8683905a78fc74bb729c38206afc025b05aefb3c332d8f
Deleted: sha256:84667fc415765595c1dd1b84fe039a29ddc9a8d2bccba62a0e5512da09093e0d
Deleted: sha256:c15bcb84931195b336dfa6194884d7ec24bd5e957cdb34f536fa4506fd035eb6
Deleted: sha256:d813b791dd21ee059e847c556e79855919148db2563b601ea201daa25f6050d7
Deleted: sha256:301df43c05f82904d6f3a2076a91927ca62ac476196fb22bfcb67b0986f0d433
Deleted: sha256:a87f8d77f78e46efbae9eb86b066ab3745997a58c22cad143a1b29eb82782832
Deleted: sha256:c5cf1681af6775fa82b1f9307b42e48c51907e645a68057f7c518c3984fd80a3
Deleted: sha256:9f861540808601eaa4629ffe24857f4d891cc82d3124c977307e267121f69fad
Deleted: sha256:e3aad566ee86167cc9ed62962005d92eb1194f32a0b5a74f47b7cac94a29ae85
Deleted: sha256:dbfb2336d9e052c6ffad449df60410fb1e2ff9ae48b84a0315f80978eed583fe
Deleted: sha256:afd72826fa181a94745a73e5f1b3301344b874b26757268d804a7bc56bc06336
Deleted: sha256:ac62eb7d85597edca78494a2c4c5aeb3285aa229144c9333a6012629198503a0
Deleted: sha256:c10d48a861433735bf82bfc4b7cfdfe238b407f5ad16e3f24320c1714aa063c8
Deleted: sha256:d647ac539f491372e209e9ab0efec064eb54833d7365e5679ddf0a49bf3ab270
Deleted: sha256:ee14d3a1541e2bc0fe6cc9f9c7bf2f21dc9b44001056b1a68af694806e34f00c
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:9f1821f219e838515c8683905a78fc74bb729c38206afc025b05aefb3c332d8f
  Associated tags:
 - 20190629-000235
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190629-000235
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190629-000235].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:9f1821f219e838515c8683905a78fc74bb729c38206afc025b05aefb3c332d8f].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3679

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3679/display/redirect>

------------------------------------------
Started by GitHub push by iemejia
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont
FATAL: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:52486
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.Request.call(Request.java:202)
		at hudson.remoting.Channel.call(Channel.java:954)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
Caused: hudson.remoting.RequestAbortedException
	at hudson.remoting.Request.abort(Request.java:340)
	at hudson.remoting.Channel.terminate(Channel.java:1038)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1450)
	at hudson.remoting.Channel.close(Channel.java:1403)
	at jenkins.slaves.DefaultJnlpSlaveReceiver.afterChannel(DefaultJnlpSlaveReceiver.java:173)
	at org.jenkinsci.remoting.engine.JnlpConnectionState$4.invoke(JnlpConnectionState.java:421)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fire(JnlpConnectionState.java:312)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fireAfterChannel(JnlpConnectionState.java:418)
	at org.jenkinsci.remoting.engine.JnlpProtocol4Handler$Handler$1.run(JnlpProtocol4Handler.java:334)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR: apache-beam-jenkins-15 is offline; cannot locate JDK 1.8 (latest)

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3678

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3678/display/redirect?page=changes>

Changes:

[chamikara] [BEAM-7548] Fix flaky tests for ApproximateUnique (#8960)

------------------------------------------
[...truncated 208.48 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -316: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-258\x12\x04-256'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -316: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-258\x12\x04-256'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_15_50_55-4511359713742943227?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_15_57_39-9992977662838624095?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 843.908s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-224311
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:92540ad3cf1626cfe250b1537827367c1d08a519e1af86a09acdac508364fa95
Deleted: sha256:64f1b2279009c25571356337976415eccf22a27c758dcd1d7f5eecd23ab048db
Deleted: sha256:a07ae186502a55dc859760ac1fe28a5b2346753dc8e9ca5694e5b8aed5e358aa
Deleted: sha256:534f3cdce2e4165b21f3cfa496637b6770668d6c7984d11c59e27d94b4e1eaf9
Deleted: sha256:4d15b7d0a72ae09df1fd876389c231dfe047200090e80a094aa1c72448ae0f55
Deleted: sha256:1a8c5320e691ea281524704b74151254c057693233dc96d5af5ca4084a1147a9
Deleted: sha256:f590252a4eaea2c4270449d9ed32496678c9efcdfb1d394eabd370367b0bd5b3
Deleted: sha256:09cd2ce1b16a6322c43f8437256e1eca2fe56764b14f1091c5a9fbbabcbea53a
Deleted: sha256:c0538c3492b641838338704c0d5dd997164d0b279e02e3fa7fbfb1de3ffb68d5
Deleted: sha256:ef9df069eae14bf330677e1e7e493c34fb437ba0b0fc76f3e2dfd5f05d90827e
Deleted: sha256:ea7d9b6c33e62f0584fd28593fdd5e5950b6da20a189f982103dc68f6a35f255
Deleted: sha256:70a1322034fd406f6ee274634e12d64446a49cc99abb578e01eba12cbb5240d6
Deleted: sha256:4771b7cca0fe25161c1de40a101cb414e2c21bb02932af0dc925e2441d5a459a
Deleted: sha256:743d357d79233e53351949416be814bd7f08769ae6e61ded2fa0eec457892185
Deleted: sha256:40a345fa2ad20a00c96f75799c0506a40fa4c4d54dae1636a4e2e7eddd9ed487
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:92540ad3cf1626cfe250b1537827367c1d08a519e1af86a09acdac508364fa95
  Associated tags:
 - 20190628-224311
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-224311
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-224311].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:92540ad3cf1626cfe250b1537827367c1d08a519e1af86a09acdac508364fa95].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3677

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3677/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for Map transform

------------------------------------------
[...truncated 207.76 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_14_44_23-13509402300076266189?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
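
The failure itself is the repeated KeyError above: bundle_processor.py line 593 performs a plain dictionary lookup on data.ptransform_id, so the error means the runner delivered data-channel elements for a ptransform id the SDK harness never registered for this bundle. A minimal illustrative sketch of that failure mode, not the Beam worker code itself (the registered id and handler below are made up; the missing id is copied from the log):

# Illustrative sketch only -- not the actual Beam SDK worker code.
# The harness keeps a map of registered ptransform ids to input handlers;
# data arriving for an id that was never registered raises KeyError,
# which is the shape of the traceback above.
registered_inputs = {
    u'\n\x04-100\x12\x04-98': lambda encoded: len(encoded),  # hypothetical registered input
}

def process_encoded(ptransform_id, encoded):
    return registered_inputs[ptransform_id](encoded)

try:
    process_encoded(u'\n\x04-344\x12\x04-342', b'')  # id copied from the log above
except KeyError as err:
    print('KeyError:', repr(err.args[0]))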

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_14_51_36-18291564045142457440?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 929.544s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-213720
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:c4952a035e05a16d8085569e7a3a7619e4fe0ead5bbd2b5a280846bc53532a53
Deleted: sha256:6c695cd9fb0d50f97bfa02f1e30cb950f5c686b7ba6911acf74850bdfc2bd73d
Deleted: sha256:8958f7fc71c21f9cc000be7440b8ca40771f5e70d52ac867aaaf1a9ddf625958
Deleted: sha256:9a370c89018512ed7a4d860ea4266a98ffc226f08913d32b3d8614b2d871edf7
Deleted: sha256:649194b63322f5f1059c37386cc6f7ef9fc00bf6b527546791aa4bb87d243a7f
Deleted: sha256:76550c69436e13f39b2420656925161573f8e0210314655e1de56b3ce9878ef0
Deleted: sha256:9b735b557ef4bd2be266148b50d46a7b2642e16947d64044d5ae3e46680af443
Deleted: sha256:443794f7cb067d54abe51817b9fcdb88ccf5329ee24c009494db10783519d09e
Deleted: sha256:b05d6232eaff83a53404b9cf2c2973c7e86c65ac30249d0882bbed4b1d3d462e
Deleted: sha256:8c1aca4998ceaeedf9d1623c6c2b61519b0f723e889b4d9d4ba00fdebff053bf
Deleted: sha256:4956ff9d67e84a25b96f99b906b15a8c69d4cec6f672e70166c6ef5cb778ce59
Deleted: sha256:484190a83d6aac3105b5441ff6c8269db8478f0b0a959408bf19d8f4823da742
Deleted: sha256:6a1f13b136dc22ead9f15dd3a013972aa2a13b4d4b0c045d4e3c4076dac7f7b7
Deleted: sha256:dbecd0f4f5bb93691af0ef322407cc4bdc6273ded0f113778f4c8f041b21ac2d
Deleted: sha256:ecabfc67f88a840f1ca63c28db10056c4f703d49f48e9119d7becf8e731368d3
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:c4952a035e05a16d8085569e7a3a7619e4fe0ead5bbd2b5a280846bc53532a53
  Associated tags:
 - 20190628-213720
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-213720
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-213720].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:c4952a035e05a16d8085569e7a3a7619e4fe0ead5bbd2b5a280846bc53532a53].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3676

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3676/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-7547] Avoid WindmillStateCache cache hits for stale work.

------------------------------------------
[...truncated 207.47 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
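
The two "Discarding unparseable args" warnings are expected here: --output is an application-level flag that the wordcount example parses with its own argparse parser, so when the full flag list reaches PipelineOptions no registered options class claims it and it is reported rather than applied. A small sketch of that behaviour (the bucket path is a placeholder):

# Sketch: PipelineOptions only keeps flags registered by some PipelineOptions
# subclass; anything else is reported when the options are resolved.
import logging
from apache_beam.options.pipeline_options import PipelineOptions

logging.basicConfig(level=logging.WARNING)
opts = PipelineOptions(['--runner=DirectRunner',
                        '--output=gs://example-bucket/output'])  # placeholder path
opts.get_all_options()  # expected to log: Discarding unparseable args: ['--output=...']
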
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -317: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-259\x12\x04-257'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -317: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-259\x12\x04-257'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_13_55_42-14492816578031927032?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_14_02_45-2925156432989112261?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 794.974s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-204556
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:03d37041828c6f6fba9fa6ff01b12fd1277608fb6a7f60520095f5c75340e60e
Deleted: sha256:92d26c89798a42e1a99d6a6d5796a219a7858176968b3a40d0697dcdffe229e0
Deleted: sha256:6b444eb75fbf057771188962533cf46ba7b5f1c2b8015850e0f2f44fa29c6c06
Deleted: sha256:048bc0b6b243261184ae83282997844ec1b21558f4d15f7b74db0cd52bf4783e
Deleted: sha256:419e1ddba8c9a6062df431c2214e551f7ce80affdfb62afb2c5e18b4f5dcd170
Deleted: sha256:72cf9d67d8cd084f3c504d5dcbe562f0fadee49736ffdbfe9a36c3dc887785ad
Deleted: sha256:636137f0174cda0a084902d6a0035fdfbbf2a0d817527433d22aba5d0f755553
Deleted: sha256:ef627209fb9b0afaba7a64a77d1311bca2d3cffdf2f17f5b0a7b39c29d90382e
Deleted: sha256:a4e3ff5faa4caf0cf80602dac520db6230fbbb2cc12343848caf3105ddd85795
Deleted: sha256:5807202d9058b09f36a90ef4d31ac6adb480069c2a3bfd3e7a17cb855826f3c2
Deleted: sha256:9323f192b6b2f53ceca6aa4b6e65a8d99624b74c68652372e6dfd480fdab8a79
Deleted: sha256:fe79ac05bf3cace9448d3f0a7ea8a063c2326fad5cb8c92e5ca4e17fc0f52fd5
Deleted: sha256:5e2c508f180454f8925cd3ee2160577dc66a6ca0b7348241e2d925e58dc4a107
Deleted: sha256:92a27b748928d9dc6645f58c39cd4e98efce19caacf75bc0979a8dd635f3d83f
Deleted: sha256:6277231b7d82b968d527806b64745a34b834e56749ae870e92c43e6453463670
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:03d37041828c6f6fba9fa6ff01b12fd1277608fb6a7f60520095f5c75340e60e
  Associated tags:
 - 20190628-204556
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-204556
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-204556].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:03d37041828c6f6fba9fa6ff01b12fd1277608fb6a7f60520095f5c75340e60e].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3675

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3675/display/redirect?page=changes>

Changes:

[zyichi] [BEAM-7586] Add Integration test for python mongodb io

------------------------------------------
[...truncated 207.82 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_11_31_42-1316291474126148002?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_11_38_46-13358278940507584586?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 905.393s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-182403
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b96d2eb467c85956cb7fac4d765a855a1b3a9036d4b8cf0886015764baf034e6
Deleted: sha256:58c47937b3559b625f5ad172e6b0d2ac9a66c3da89e31881689b8a2e9338dda3
Deleted: sha256:0627f94a4fdeda32d8d24e62c57e69923b135a2d1fe9970a695842fbbbd1f755
Deleted: sha256:0b2266fa20f916093e252a15d53941bf59b71329bad39df816d76184912a8d08
Deleted: sha256:c7adb9382b3ab8d194725afe4e8a5b6fac6183d3db0676a56ce69e476fdd974b
Deleted: sha256:2c3ec8de534fc249a71e72192f404f891eec4336da7e2d5f93d95e406cadc3df
Deleted: sha256:e60f163ca2a94458efce45b645e0248ca03bdf570a2fb9fb2f91014fec9e31df
Deleted: sha256:c361e2cfcb627afb9ffaaa42a8af53423e1a702f327df363733fe637a60520c4
Deleted: sha256:b92f29b6e48a17de2cfe9132760a5724013d6f0ef40d3f013a99c90310d53c2b
Deleted: sha256:108358d090487f48ea9533c37a6a9867e572e242bfc809439773c27f7c76098e
Deleted: sha256:729796c8d9153a045fa5f816536317f2574ccfe43a0c0255699a4afeb268226d
Deleted: sha256:4d107600c54291468569d86b7a7017d39f324877e4e93ff7cdc91e5974781653
Deleted: sha256:087a868e9581b570b8eb0d91477c53fa85f153e375264846aa4570b8fed3821b
Deleted: sha256:8b9f6ce18c4d4387c728b8307007f0d43188609879459f0a1c31fb87081e1ab1
Deleted: sha256:e882e79536cf5cf80fc6c303dd75abf36c74af3bf2e34b212c7e61986772a2ff
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b96d2eb467c85956cb7fac4d765a855a1b3a9036d4b8cf0886015764baf034e6
  Associated tags:
 - 20190628-182403
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-182403
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-182403].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b96d2eb467c85956cb7fac4d765a855a1b3a9036d4b8cf0886015764baf034e6].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3674

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3674/display/redirect?page=changes>

Changes:

[juta.staes] [BEAM-7326] add documentation bigquery data types

------------------------------------------
[...truncated 207.77 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_11_09_39-16750614922441298193?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_11_16_42-15259098329786254000?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 855.882s

FAILED (errors=2)
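
Both failures above report the same underlying error: the Python SDK harness
receives a data stream for a ptransform id it never registered, so the
dictionary lookup inside bundle_processor.process_bundle raises KeyError,
which the Dataflow runner surfaces as "Error received from SDK harness for
instruction ...". A minimal sketch of that lookup pattern (hypothetical
names, not the actual bundle_processor code):

    # Hypothetical sketch of the dispatch that fails above: encoded data is
    # routed to input operations keyed by ptransform id, and an id unknown to
    # the SDK side (e.g. u'\n\x04-322\x12\x04-320' in the log) raises KeyError.
    class InputOperation(object):
        def process_encoded(self, encoded_data):
            pass  # the real harness decodes and pushes elements downstream

    input_ops_by_transform_id = {'known-transform-id': InputOperation()}

    def dispatch(ptransform_id, encoded_data):
        # A KeyError raised here is what the worker log reports.
        input_ops_by_transform_id[ptransform_id].process_encoded(encoded_data)

    dispatch('known-transform-id', b'payload')   # ok
    # dispatch('unknown-id', b'payload')         # would raise KeyError
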
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-180103
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ceb269d18f554eb6a4443c4f5e9dc219047b5fe0871120e5794965534547a437
Deleted: sha256:96ec564a67a5396874e86d54925555e757c3fefeda4dffcc881fa8a59fe367ee
Deleted: sha256:b668f648794940b78b11c1a23b8345c0fa815e3e0b2d9245c61f4efd0defd542
Deleted: sha256:628958688c3ef283f7441de283474190098545f632a0044fe508aa8cd61ea8fa
Deleted: sha256:4947f092e7879e80571df7deba44b38e5debed38b45287fa9fc52722d4038b24
Deleted: sha256:8363ff93c500e0e1ef2e9701cb99c5ed75f91204aac0ec0270f71c7cf64c16e3
Deleted: sha256:bab6458ea2410e27e3e85f4537efebf3b8009fac6ae8e5fcb8f03542ae722c91
Deleted: sha256:544cee85493a69f9a6ec823553522b1b7645ede82cd4366aaf534d2152d05e96
Deleted: sha256:efd4863b8f78814e6b8e735e63839385f598fe037a081e57f4f924a197f492bf
Deleted: sha256:97f2fc798b996b8bb065aa877dc7f30ae1d4f4bfc8445a3b593dbe0a15ed4fe8
Deleted: sha256:9fcaf2fe3df504d9364392096af973024f6ba18edc8d582beeb16d92b8b7fc62
Deleted: sha256:c05b5eb140d71dada61ee49672263577a2310cadb243e2ff5d407a87c949f3ec
Deleted: sha256:16b83a29dbe765e66abaf7b21787a0039f6d3beb2dfad3417b72c10159b3d12c
Deleted: sha256:08b096a22adcb4738b622c2fedfd0617f30b64be75976303e49631c1a6ba1c97
Deleted: sha256:11df8c2a86ebfce03a7a25ff01f0acf772d8e4fe0419c7b8a1cea57daaaa298f
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ceb269d18f554eb6a4443c4f5e9dc219047b5fe0871120e5794965534547a437
  Associated tags:
 - 20190628-180103
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-180103
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-180103].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ceb269d18f554eb6a4443c4f5e9dc219047b5fe0871120e5794965534547a437].
Removed the container
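
The cleanup_container step above removes the per-build worker image both
locally and from the us.gcr.io registry. A rough Python sketch of equivalent
cleanup built on the standard docker and gcloud CLIs (an assumption about the
mechanism, not the actual Jenkins script):

    # Sketch only: assumes docker and gcloud are installed and authenticated,
    # and reuses the image/tag names printed in the log above.
    import subprocess

    IMAGE = 'us.gcr.io/apache-beam-testing/jenkins/python'
    TAG = '20190628-180103'

    # Remove the local image; this prints the Untagged:/Deleted: lines.
    subprocess.check_call(['docker', 'rmi', '{}:{}'.format(IMAGE, TAG)])

    # Delete the tag and its digest from the registry, which prints the
    # "Deleted [us.gcr.io/...]" lines.
    subprocess.check_call([
        'gcloud', 'container', 'images', 'delete',
        '{}:{}'.format(IMAGE, TAG), '--force-delete-tags', '--quiet'])
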
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3673

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3673/display/redirect>

------------------------------------------
Started by GitHub push by chamikaramj
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont
FATAL: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:44718
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.Request.call(Request.java:202)
		at hudson.remoting.Channel.call(Channel.java:954)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
Caused: hudson.remoting.RequestAbortedException
	at hudson.remoting.Request.abort(Request.java:340)
	at hudson.remoting.Channel.terminate(Channel.java:1038)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1450)
	at hudson.remoting.Channel.close(Channel.java:1403)
	at jenkins.slaves.DefaultJnlpSlaveReceiver.afterChannel(DefaultJnlpSlaveReceiver.java:173)
	at org.jenkinsci.remoting.engine.JnlpConnectionState$4.invoke(JnlpConnectionState.java:421)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fire(JnlpConnectionState.java:312)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fireAfterChannel(JnlpConnectionState.java:418)
	at org.jenkinsci.remoting.engine.JnlpProtocol4Handler$Handler$1.run(JnlpProtocol4Handler.java:334)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR: apache-beam-jenkins-15 is offline; cannot locate JDK 1.8 (latest)



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3672

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3672/display/redirect>

------------------------------------------
[...truncated 208.04 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -289: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-267\x12\x04-265'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -289: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-267\x12\x04-265'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_05_08_21-12196534193889079541?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-28_05_16_00-11117629591859549030?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 840.447s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-120009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:3f141ca59522e9aa2a55b34d548b542c17ab818a925a7416d42dbf9877b702a8
Deleted: sha256:318e4338f0d03bae0a286644e75f1cca122eadfb93758c0f5c17ef02a2735da1
Deleted: sha256:9a2a71db5eef94941d9a147c9740b3dfedc5a2d7405c1d43592633840aec8f32
Deleted: sha256:57e6fc39df0c745f4a09f9b29604248cb4a7f388b208dc83041654f24265eafa
Deleted: sha256:4f84bb8d6beeaf23f7770b6e603216249e47f0fb6a4c4bb5c9dbd3c6b5c4f61f
Deleted: sha256:118bb3e656b2143dd73baaee708e5032140b7d8c47bcaf18bf5590abfba876bb
Deleted: sha256:c62098e281ff6329b45e808297f148e55505971330a1c98e00412f719c041c01
Deleted: sha256:6a62b540e344980b66b66605605a8c2a3bb55dec46607502e86348a1250674b4
Deleted: sha256:30ab62a9882a94d9d45ea638941dcfa62b982f6ec7cf9ecf55dcf4384d34e3dd
Deleted: sha256:7febb5db72f45de914ebffb43a578f6595ba08596e7bdb524f8fb2a227683165
Deleted: sha256:1c525b310c800cb81aaef441b19f4c26178fa6443d2f23dc9bfbc01f0abbffa0
Deleted: sha256:732376deb666914ced725d5cfc81356d1547f2c51bbc219d52dc8576d76d59d0
Deleted: sha256:ca9e83a08d2b2b5845b514e112a94996483d7ec4005f3aa49f9d476c06bae8d6
Deleted: sha256:47e1780bc5eb7a31f81219bfa964ff02ad1fe4c82fd21b58ab897fb642faaeca
Deleted: sha256:2b5514239fa97d3ad2a8def5cacd4e180615a483d32925a5b5ca14854ba89de2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:3f141ca59522e9aa2a55b34d548b542c17ab818a925a7416d42dbf9877b702a8
  Associated tags:
 - 20190628-120009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-120009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-120009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:3f141ca59522e9aa2a55b34d548b542c17ab818a925a7416d42dbf9877b702a8].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3671

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3671/display/redirect>

------------------------------------------
[...truncated 207.79 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -377: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-319\x12\x04-317'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -377: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-319\x12\x04-317'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_23_08_32-3911881753007080056?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_23_16_11-15204935944568405686?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 819.525s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190628-060011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d7cdd7e38a982dc48c66cd038c2baae1007288e5b239a070b43725e2c9d702c6
Deleted: sha256:1551b50e420a22c05ba73c42b9b9a087a23603ba4299f6e341c60e4073d7945b
Deleted: sha256:40bc7a58b802d3bb8fcc2279a81e11b4372cba7aaf5e1b611e727a78b3561be9
Deleted: sha256:78e3cd6af39469b415fd367ce18e577ac82dc82a0bd6027800af67e2d304d6e3
Deleted: sha256:47c40b5d04d117bff88f90de1ebfa768ab122e6e16174189c7be09b955990e42
Deleted: sha256:7ada633df41a5144e49cc9139d669491ee7bc49047c882dc9a02f13683bf8e13
Deleted: sha256:fb93876aff40945b3970b7bf64771d2fcbb611fb2773f8d59ae9834622946318
Deleted: sha256:a2058a4827179beb74eae157a0c04619c619e1fb88c041fae3aada7f17000c3d
Deleted: sha256:17997ccdf21834292e6d54bdd577dd6ce2b75a44dfae0580a39c0453efe56a16
Deleted: sha256:f674d2558bfe056a8d8c6a5f12887e828a30402b7592612195a806cec3cb42e8
Deleted: sha256:b83b1a65b6a6511a8ae3d323a87ee59474c00bf4a5121780dd9a96b49f66922a
Deleted: sha256:773f1155d0e323dd7d5acee1d6593c46b5fb8c7d3b6612ff4584588252da391c
Deleted: sha256:0f5d12f77a11851cdc2d0c432839ba4c282811fbd3d5e0d7753c04885e414bf8
Deleted: sha256:aabc6701101e05d8e530bfba1c04360a21740666f9dbcf8ba89845d01769ecef
Deleted: sha256:cde15551497fd65c77b5ab4bf243b18a0156974df1b1f9fe93ec3215013742bd
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d7cdd7e38a982dc48c66cd038c2baae1007288e5b239a070b43725e2c9d702c6
  Associated tags:
 - 20190628-060011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190628-060011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190628-060011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d7cdd7e38a982dc48c66cd038c2baae1007288e5b239a070b43725e2c9d702c6].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3670

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3670/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont
FATAL: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:43370
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.Request.call(Request.java:202)
		at hudson.remoting.Channel.call(Channel.java:954)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
Caused: hudson.remoting.RequestAbortedException
	at hudson.remoting.Request.abort(Request.java:340)
	at hudson.remoting.Channel.terminate(Channel.java:1038)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1450)
	at hudson.remoting.Channel.close(Channel.java:1403)
	at jenkins.slaves.DefaultJnlpSlaveReceiver.afterChannel(DefaultJnlpSlaveReceiver.java:173)
	at org.jenkinsci.remoting.engine.JnlpConnectionState$4.invoke(JnlpConnectionState.java:421)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fire(JnlpConnectionState.java:312)
	at org.jenkinsci.remoting.engine.JnlpConnectionState.fireAfterChannel(JnlpConnectionState.java:418)
	at org.jenkinsci.remoting.engine.JnlpProtocol4Handler$Handler$1.run(JnlpProtocol4Handler.java:334)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR: apache-beam-jenkins-15 is offline; cannot locate JDK 1.8 (latest)

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3669

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3669/display/redirect?page=changes>

Changes:

[chamikara] [BEAM-7548] fix flaky tests for ApproximateUnique (#8948)

------------------------------------------
[...truncated 207.50 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_13_45_39-11238739612809345171?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_13_53_02-7834355204156153515?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 883.876s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190627-203815
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:049446a4d8135edec5b15510aaa51be2c8cb35327ec5421b6c726dd441a415e3
Deleted: sha256:5bb10ada796e1b214f55c177a5904c1f57bde184c1f9f30eebe4ea3b94f59bb1
Deleted: sha256:62139dc006baa38295a9cc112b59954a1933c872ba63f62128c121da4a582f5d
Deleted: sha256:2e40780060134224301def42c2748d4d078900c4e95074bd10de47667f9261de
Deleted: sha256:a2573bee6e928d52d437d4a1a901474bcc8af012d6d266987fd8e970f1ffdbed
Deleted: sha256:7358d0fd9ec88a55e3e34f47125b56259cb2d15e3ee92ad4a1645ae4a32964ff
Deleted: sha256:121e2b465ca5420b0c61b3b83c28d27083c0942cfd9aecfeb3c138e1efcee514
Deleted: sha256:8016a9751fd3a0202027d58cfea38493ae3cca433690af361b6d3ea96cbbfaa5
Deleted: sha256:3153bb34f06e09afac5f3f913d9b907394dfa94af363855fb75cc398d78a686d
Deleted: sha256:f62a3c89e1c2fed73f7d56256b22bc82691ad19317b9ee34d6939a1a7d27e7c2
Deleted: sha256:a8265b681bcf5e4ff128c2d870f75b286398455b8a2c1f73ed56810c4c05a154
Deleted: sha256:36d83dec23b11a6ca4360874edfa1ce593c4a6c75322d8215385698ca64e93c4
Deleted: sha256:77191952a3a4483bde71a2fb14435ebc280f86cb7f50afd6ebcfe43fd0fb0f0e
Deleted: sha256:3802c3bc185c5d81ada0397aa8efb0d075dbec8e204005089b661b82e44f631d
Deleted: sha256:e72e646bbb6d36e5f31c7f6be1c66477285f043c5b196736adede5f6463491cf
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:049446a4d8135edec5b15510aaa51be2c8cb35327ec5421b6c726dd441a415e3
  Associated tags:
 - 20190627-203815
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190627-203815
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190627-203815].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:049446a4d8135edec5b15510aaa51be2c8cb35327ec5421b6c726dd441a415e3].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3668

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3668/display/redirect?page=changes>

Changes:

[valentyn] Match Python 3 warning message in __init__.py with the one in setup.py.

[kedin] Spotless config update to include java files only under src directory

------------------------------------------
[...truncated 207.94 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_13_20_16-7488411476495979409?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_13_27_40-15728511241086611959?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 900.270s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190627-201140
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:008d2071a95931a8966a5798ff8837a3de9b27c4b87661594345b805ec4bdf79
Deleted: sha256:69e6fc5b3a4d490784505bfa33dd916aedc02eec19d4ebc1006605c5dcb83bc0
Deleted: sha256:e3ae9234a907cdef98a7e789dea38885daa25278687d75bffd023d23ca73c24f
Deleted: sha256:c463fb87c2b5c443b7b97a8a9f26e5d29d2d255b6e64086a1de1505140683852
Deleted: sha256:89b40b6f5d32d2d3add4b099f9b030cad9333930b611050c07e53607a8b0ede6
Deleted: sha256:988c572456705b5dcccb36982689a487f4526e9d5a231dccf305c2be9c4eb48d
Deleted: sha256:eb3049ae219860146263ad4fd04aef24dd5252d3138efd14982d5bf7cfcf2f47
Deleted: sha256:37df68b287e567c08752966f6e70950df9fe3751a3c19e3fff42a55e685e65e8
Deleted: sha256:869a70ddc4d501b58ed73dec9f14d5921cdbdd8580f30a06c8643c52abb3f07c
Deleted: sha256:5ff991a808a972a77395eb4388f5332c731c8b6793c6aed86f92705c2ae20578
Deleted: sha256:b2aeb8219d38c3b764f9b8229cd680cea9ea6a1b6a2e7f222cef53fa5dcde8cb
Deleted: sha256:3b475df21d98171b06142b9117aa3818ab6fb2a67bded91aefcda30a387d7610
Deleted: sha256:ead10fbd406878da93e18c84e87e7a621218c9350f9c8ce18529f49c4ca79a67
Deleted: sha256:c578b3da2d447d6e3671a473c1c5bff78d1915ca52e11c468fb5d8abcad22886
Deleted: sha256:4a9d1eeccf82344ab94760b3b8fd8ea2d10915d3af6b81bca5f7e9d01ba5ac33
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:008d2071a95931a8966a5798ff8837a3de9b27c4b87661594345b805ec4bdf79
  Associated tags:
 - 20190627-201140
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190627-201140
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190627-201140].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:008d2071a95931a8966a5798ff8837a3de9b27c4b87661594345b805ec4bdf79].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3667

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3667/display/redirect>

------------------------------------------
Started by GitHub push by akedin
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
FATAL: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.Request.call(Request.java:202)
		at hudson.remoting.Channel.call(Channel.java:954)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
Caused: hudson.remoting.RequestAbortedException
	at hudson.remoting.Request.abort(Request.java:340)
	at hudson.remoting.Channel.terminate(Channel.java:1038)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:172)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.NetworkLayer.onRecvClosed(NetworkLayer.java:154)
	at org.jenkinsci.remoting.protocol.impl.NIONetworkLayer.ready(NIONetworkLayer.java:179)
	at org.jenkinsci.remoting.protocol.IOHub$OnReady.run(IOHub.java:795)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR: apache-beam-jenkins-15 is offline; cannot locate JDK 1.8 (latest)
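This build never reached the tests: checkout could not create the workspace because the agent's disk was full (java.nio.file.FileSystemException: ... No space left on device), both retries hit the same condition, and the remoting channel then dropped, leaving apache-beam-jenkins-15 offline. A quick, purely illustrative way to confirm the condition from Python on the agent (not part of the Jenkins job; the path is the parent of the workspace directory named in the exception, since the workspace itself could not be created, and os.statvfs is assumed to be available given the Unix paths in the log):

import os

# Free space on the filesystem backing the Jenkins workspace.
st = os.statvfs('/home/jenkins/jenkins-slave/workspace')
free_gib = st.f_bavail * st.f_frsize / float(1024 ** 3)
print('free space under the workspace: %.2f GiB' % free_gib)

A value near zero here would account for the repeated mkdirs failures above.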

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3666

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3666/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
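The "No space left on device" failures above mean the volume hosting the agent's workspace filled up while Jenkins tried to recreate the workspace directory. As a purely hypothetical diagnostic, not part of this build, free space on the agent could be checked from Python with os.statvfs (the path below is an assumed workspace mount point):

    import os

    st = os.statvfs('/home/jenkins')  # assumed mount point; adjust for the actual agent
    free_gib = st.f_bavail * st.f_frsize / float(2 ** 30)
    print('free space on workspace volume: %.1f GiB' % free_gib)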



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3665

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3665/display/redirect?page=changes>

Changes:

[markliu] [BEAM-4046, BEAM-7527] Fix benchmark with correct Gradle project

------------------------------------------
[...truncated 207.40 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_05_09_27-6179628939711143848?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
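As a hedged aside (an interpretation, not something stated in the build output): the KeyError key above looks like protobuf wire format carrying two short length-delimited string fields, '-344' and '-342', plausibly the ids of the data-channel target that the runner referenced but the SDK harness had not registered. A toy decoder that assumes single-byte tags and lengths and length-delimited fields only:

    def decode_len_delimited(buf):
        # buf is a bytearray; each field here is <tag byte><length byte><payload>.
        i, fields = 0, []
        while i < len(buf):
            field_no = buf[i] >> 3          # upper bits of the tag byte
            length = buf[i + 1]             # payload length
            fields.append((field_no, bytes(buf[i + 2:i + 2 + length])))
            i += 2 + length
        return fields

    print(decode_len_delimited(bytearray(b'\n\x04-344\x12\x04-342')))
    # -> [(1, '-344'), (2, '-342')] on Python 2 (b'-344'/b'-342' on Python 3)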

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-27_05_16_30-9754762587380074176?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 819.149s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190627-120217
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:788182637a3cd8ecd28517e34489030de4a06334672b33a697982a967abc9098
Deleted: sha256:a75f90651347db029bd8ce87ebf3d56a62aea9b44f78071d55da83bf0e9acc18
Deleted: sha256:5f6b378a7679715cfa3938e9cb2fa16b161d3eb7fc75b965a1d5a5df17d1a10c
Deleted: sha256:9fd2425ed2f7bc80d8d102cac3502c80afeacb59ad868ee79a359a1a56a9aa66
Deleted: sha256:646124e5b868cb36e4ac563829615a287574d7079392dbc34b73d987a8f0d6e8
Deleted: sha256:2a1cfc154eac0274be43829bad50f0e62a3dd624a88078f5b1aa45dda70606f3
Deleted: sha256:e1167596505fdbec6cf5e6e5d0e4dca4b62af58d53288084f1bf0893f06b978d
Deleted: sha256:06d15216e1aea32bcb2f503ba275d4d1aab7f2bb47bbae23676185e610502b9c
Deleted: sha256:473225d75a5bfbf448643680e2812a3e7731fdb4fef34126afe7b7c41ce27826
Deleted: sha256:f79b776828ed4db8db77aef02cdd1b0b29e8da85054c4bb7cbc7f34347c17b79
Deleted: sha256:cd7c72d3f19a60037dcc4012fbf09138d916c19673ba7029764f91ffb59c1e83
Deleted: sha256:05fb574af522e8835748d7adfbbacb96547e708a86d80e85b943f1c2a9e273ac
Deleted: sha256:fb416e0b980ebda94e187f3465c224a753d9376a58492df6795ed61f248bd7af
Deleted: sha256:4ee625dab7a4f7335861a72cbf98eca7e00ef55782c436d3d4839f5dc44d1c0a
Deleted: sha256:de15c4de8c87d665220b379924b031f294fdbe7ecbc1e2445149f82c6d83f109
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:788182637a3cd8ecd28517e34489030de4a06334672b33a697982a967abc9098
  Associated tags:
 - 20190627-120217
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190627-120217
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190627-120217].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:788182637a3cd8ecd28517e34489030de4a06334672b33a697982a967abc9098].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3664

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3664/display/redirect>

------------------------------------------
Started by GitHub push by lgajowy
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3663

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3663/display/redirect>

------------------------------------------
[...truncated 199.67 KB...]
copying apache_beam/testing/benchmarks/nexmark/nexmark_launcher.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark
copying apache_beam/testing/benchmarks/nexmark/nexmark_util.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark
copying apache_beam/testing/benchmarks/nexmark/models/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/models
copying apache_beam/testing/benchmarks/nexmark/models/nexmark_model.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/models
copying apache_beam/testing/benchmarks/nexmark/queries/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/benchmarks/nexmark/queries/query0.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/benchmarks/nexmark/queries/query1.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/benchmarks/nexmark/queries/query2.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/data/trigger_transcripts.yaml -> apache-beam-2.15.0.dev0/apache_beam/testing/data
copying apache_beam/testing/load_tests/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/co_group_by_key_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/combine_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/group_by_key_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/load_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/load_test_metrics_utils.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/pardo_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/sideinput_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/tools/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/coders_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/distribution_counter_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/map_fn_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/microbenchmarks_test.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/sideinput_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/utils.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/transforms/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/combiners.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/combiners_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/core.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/create_source.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/create_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_combiners.pxd -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_combiners.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_dataflow_distribution_counter.pxd -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_dataflow_distribution_counter.pyx -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/dataflow_distribution_counter_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/display.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/display_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/dofn_lifecycle_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test_it.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/ptransform.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/ptransform_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/py_dataflow_distribution_counter.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/sideinputs.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/sideinputs_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/stats.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/stats_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/timeutil.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/write_ptransform_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/typehints/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 71, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1353, in cancel
    self.job_id(), 'JOB_STATE_CANCELLED'):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 634, in modify_job_state
    self._client.projects_locations_jobs.Update(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 812, in Update
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
HttpBadRequestError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-26_23_07_37-13662278027793972238?alt=json>: response: <{'status': '400', 'content-length': '316', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Thu, 27 Jun 2019 06:17:40 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 400,
    "message": "(2d6844b327d17c60): Workflow modification failed. Causes: (b4f0d737a5a513b2): Operation cancel not allowed for job 2019-06-26_23_07_37-13662278027793972238. Job is not yet ready for canceling. Please retry in a few minutes.",
    "status": "FAILED_PRECONDITION"
  }
}
>
-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-26_23_07_37-13662278027793972238?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
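The cancel failure above is a FAILED_PRECONDITION that the service itself asks the caller to retry ("Please retry in a few minutes"). A hypothetical wrapper, not Beam's own logic, that retries PipelineResult.cancel() with a fixed delay could look like:

    import time

    def cancel_with_retry(result, attempts=5, delay_secs=60):
        # result is a PipelineResult; cancel() raises while the job is still
        # transitioning, so back off and try again a few times before giving up.
        for attempt in range(attempts):
            try:
                result.cancel()
                return True
            except Exception:
                if attempt == attempts - 1:
                    raise
                time.sleep(delay_secs)
        return False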

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 72, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 95, in wait_until_in_state
    time.sleep(5)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 1508.756s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190627-060011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b485adc3f6be14285dadd108ba015e2950e065c3cb22da630b990252ca732d85
Deleted: sha256:f28e912e5da59fc11e58406e3a3192342075060d8db721b0e19088f5a2490da6
Deleted: sha256:4ebb2ad4db115f1f1ec26c1b6685d70a2ae26379ce3c29bdabe9d319d78f2ae8
Deleted: sha256:36f616d641648907e8eddcc4db830fca0db9a4aaa787bd5ae4cafc6293755372
Deleted: sha256:bcdc3ae74e348816b4f9a97407ed27996d33f118199728e5fe4ee47164040966
Deleted: sha256:79b93673b8c334ade8e7df67c2f81764e70e65323f23794b9fdfb5f4a6a6a6ac
Deleted: sha256:21f7677ac1ca5c64afd96b85540f35e990734d8f5d64fe3e9a197a2a41946d34
Deleted: sha256:848e928f9e13c879e8c0b4b62dec16fca6f1577d85a618c60f88bda915a663eb
Deleted: sha256:36bc6011cc70f460feb2dc038808da9f8291fa5bd5ee4c2154c1bf988125a2bf
Deleted: sha256:8dd156b84cb50509eb052f0aedffac1fd018096e1d8bc0688fcc4b78496810a0
Deleted: sha256:056f2bab0f9b10049742678b2964b0704151a30a3148e254c9ebef695d20809b
Deleted: sha256:ea45b5a811e56346cba26b43af209cdc9128b7c4b8ca5e3590f99866214a95ab
Deleted: sha256:91cbc588afc90a610e20021b26cade92dfe63fc81cd413ae5ec8c246dbeb3e23
Deleted: sha256:858178ff7336f1b0e0f03087518b133614a1ab73e3604ef6219a43d18c5e391c
Deleted: sha256:1aeba0f93721c8d575f18c780965318218356d4479c38d3b649a9385009657aa
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b485adc3f6be14285dadd108ba015e2950e065c3cb22da630b990252ca732d85
  Associated tags:
 - 20190627-060011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190627-060011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190627-060011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b485adc3f6be14285dadd108ba015e2950e065c3cb22da630b990252ca732d85].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3662

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3662/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
Retrying after 10 seconds
Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.FilePath.act(FilePath.java:1072)
		at hudson.FilePath.act(FilePath.java:1061)
		at hudson.FilePath.mkdirs(FilePath.java:1246)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
java.nio.file.FileSystemException: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>: No space left on device
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
	at java.nio.file.Files.createDirectory(Files.java:674)
	at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
	at java.nio.file.Files.createDirectories(Files.java:767)
	at hudson.FilePath.mkdirs(FilePath.java:3273)
	at hudson.FilePath.access$1300(FilePath.java:213)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1254)
	at hudson.FilePath$Mkdirs.invoke(FilePath.java:1250)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
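All three checkout attempts above fail with the same FileSystemException: the agent's workspace disk is full, so mkdirs cannot even create the workspace directory. As a minimal illustration only (assuming Python 3 on the agent; the threshold below is a placeholder, not a value taken from the job configuration), a pre-flight free-space check run from the workspace root could look like this:

import shutil
import sys

# Hypothetical pre-flight check; refuse to build if the workspace filesystem
# has less than roughly 10 GiB free.
MIN_FREE_BYTES = 10 * 1024 ** 3

usage = shutil.disk_usage(".")
if usage.free < MIN_FREE_BYTES:
    sys.exit("Refusing to build: only %d bytes free on this agent" % usage.free)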



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3661

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3661/display/redirect>

------------------------------------------
[...truncated 212.17 KB...]
}
>
WARNING:root:Retry with exponential backoff: waiting for 3.31958049015 seconds before retrying submit_job_description because we caught exception: BadStatusCodeError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '503', 'content-length': '122', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:11:44 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "The service is currently unavailable.",
    "status": "UNAVAILABLE"
  }
}
>
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 585, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 657, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 402, in _MakeRequestNoRetry
    check_response_func(response)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 223, in CheckResponse
    raise exceptions.BadStatusCodeError.FromResponse(response)

ERROR:root:HTTP status 503 trying to create job at dataflow service endpoint https://dataflow.googleapis.com
CRITICAL:root:details of server error: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '503', 'content-length': '122', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:13:51 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "The service is currently unavailable.",
    "status": "UNAVAILABLE"
  }
}
>
WARNING:root:Retry with exponential backoff: waiting for 5.33865705988 seconds before retrying submit_job_description because we caught exception: BadStatusCodeError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '503', 'content-length': '122', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:13:51 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "The service is currently unavailable.",
    "status": "UNAVAILABLE"
  }
}
>
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 585, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 657, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 402, in _MakeRequestNoRetry
    check_response_func(response)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 223, in CheckResponse
    raise exceptions.BadStatusCodeError.FromResponse(response)
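The WARNING lines above show the generic retry-with-exponential-backoff behaviour around submit_job_description: transient 503 responses are retried with a growing, jittered delay, and only a non-retriable failure (the 429 below) is surfaced. A self-contained sketch of that pattern, for illustration only and not Beam's apache_beam.utils.retry implementation:

import random
import time

class TransientServerError(Exception):
    """Stand-in for an HTTP 503 'service unavailable' response."""

def call_with_backoff(fn, max_attempts=5, initial_delay=2.0, factor=1.5):
    # Retry fn() on transient failures, sleeping initial_delay * factor**n
    # seconds (with jitter) between attempts, like the warnings logged above.
    for attempt in range(max_attempts):
        try:
            return fn()
        except TransientServerError:
            if attempt == max_attempts - 1:
                raise
            delay = initial_delay * (factor ** attempt) * random.uniform(0.8, 1.2)
            print("Retry with exponential backoff: waiting for %.2f seconds" % delay)
            time.sleep(delay)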

ERROR:root:HTTP status 429 trying to create job at dataflow service endpoint https://dataflow.googleapis.com
CRITICAL:root:details of server error: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '598', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:14:55 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'dataflow.googleapis.com/create_requests' and limit 'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for consumer 'project_number:844138762903'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developer console API key",
            "url": "https://console.developers.google.com/project/844138762903/apiui/credential"
          }
        ]
      }
    ]
  }
}
>
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
ERROR:root:HTTP status 429 trying to create job at dataflow service endpoint https://dataflow.googleapis.com
CRITICAL:root:details of server error: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '598', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:17:01 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'dataflow.googleapis.com/create_requests' and limit 'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for consumer 'project_number:844138762903'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developer console API key",
            "url": "https://console.developers.google.com/project/844138762903/apiui/credential"
          }
        ]
      }
    ]
  }
}
>
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 476, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 538, in create_job
    return self.submit_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 585, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 657, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 402, in _MakeRequestNoRetry
    check_response_func(response)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 223, in CheckResponse
    raise exceptions.BadStatusCodeError.FromResponse(response)
BadStatusCodeError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '598', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:14:55 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'dataflow.googleapis.com/create_requests' and limit 'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for consumer 'project_number:844138762903'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developer console API key",
            "url": "https://console.developers.google.com/project/844138762903/apiui/credential"
          }
        ]
      }
    ]
  }
}
>

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 476, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 538, in create_job
    return self.submit_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 585, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 657, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 402, in _MakeRequestNoRetry
    check_response_func(response)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 223, in CheckResponse
    raise exceptions.BadStatusCodeError.FromResponse(response)
BadStatusCodeError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '598', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 18:17:01 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'dataflow.googleapis.com/create_requests' and limit 'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for consumer 'project_number:844138762903'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developer console API key",
            "url": "https://console.developers.google.com/project/844138762903/apiui/credential"
          }
        ]
      }
    ]
  }
}
>
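Both tests in this run are rejected at job creation with the same quota, 'CreateRequestsPerMinutePerUser', and each backoff retry of submit_job_description is itself another create request counted against that quota. One client-side way to stay under a per-minute limit is a small throttle like the sketch below; this is illustrative only and is not something the SDK or the test harness does:

import collections
import time

class PerMinuteThrottle(object):
    """Blocks so that at most `limit` calls happen in any 60-second window."""

    def __init__(self, limit):
        self.limit = limit
        self.calls = collections.deque()

    def wait(self):
        now = time.time()
        # Drop timestamps older than one minute.
        while self.calls and now - self.calls[0] >= 60:
            self.calls.popleft()
        if len(self.calls) >= self.limit:
            # Sleep until the oldest call in the window is a minute old.
            time.sleep(max(0, 60 - (now - self.calls[0])))
            self.calls.popleft()
        self.calls.append(time.time())

# Hypothetical usage, with a made-up per-user quota of 10 creates per minute:
#   throttle = PerMinuteThrottle(limit=10)
#   throttle.wait()
#   client.projects_locations_jobs.Create(request)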

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 478.070s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190626-180039
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b2fd71e194e095550304f2745b11c675de0ea36669b095598a948768638022fa
Deleted: sha256:afe859aa51138e3912e651ced7ebb9105cc8ee88b7c77fe87724d893b2715e30
Deleted: sha256:29cbfc2689e380005b9a252f93bb85e8156231d675cdc34fc643e1a460cc8f61
Deleted: sha256:0a4b3338202625c47526ad2a33b56b6b6c79091a0703d2ee2004b689d8396814
Deleted: sha256:aebc9305d0c268dca8812bd7183e1543077463dcaa6d4520ba355f3f9fce67bb
Deleted: sha256:981a4afa4a95a75b09bf962105d7f85a0c06a5b3bc234c945ae907848e21f84c
Deleted: sha256:d998c7b67f0579fe2b8037bf80ff3b4c1af93c482f233743eac2ee7bf43c3987
Deleted: sha256:ecbfea3ee04a12453754a24d767df0d029cc4b722da1266691120dccff2edc89
Deleted: sha256:730ee3b04c5f6f6a455bf3a03ab7beecda35010bdd0b08b6bfefc30113385055
Deleted: sha256:78b44e97ebaec1506d90e822733c4b887d27fa8fb158995d5409857d8a931567
Deleted: sha256:07d3b673be02d42ca6cefbdcbc9f8dffc54c3dfbd7de35298b65f98555ce6a2c
Deleted: sha256:f5bf58aa63f04ee9b7c5cd3855f4d97eec2a4c81b92bc9feb6fe9ecf399ee267
Deleted: sha256:572614b062c6d0556dbd6919dcb94dc0b4a401782984885047242aba7095530a
Deleted: sha256:b707b716605a618aaeb946c8253103f9f4f484470c54486a64917ca467f0a4b4
Deleted: sha256:8de2c246a0492ef564403d20b13c4faa7aba3e05848e1cc65fe15fffc5412793
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b2fd71e194e095550304f2745b11c675de0ea36669b095598a948768638022fa
  Associated tags:
 - 20190626-180039
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190626-180039
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190626-180039].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b2fd71e194e095550304f2745b11c675de0ea36669b095598a948768638022fa].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3660

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3660/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4948, BEAM-6267, BEAM-5559, BEAM-7289] Update the version of guava

------------------------------------------
[...truncated 207.50 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-26_06_25_23-5348804763408608089?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
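The KeyError above (u'\n\x04-345\x12\x04-343', and u'\n\x04-108\x12\x04-106' in the metrics failure that follows) is not random bytes: it looks like the wire encoding of a small protobuf-style message carrying two ids that the SDK harness no longer recognises. Decoding one of them by hand, for illustration only:

# Field tags follow protobuf wire format: 0x0a is field 1 (length-delimited),
# 0x12 is field 2 (length-delimited).
key = b'\n\x04-345\x12\x04-343'
assert key[0:1] == b'\n'      # field 1 tag
assert key[1:2] == b'\x04'    # length 4
assert key[2:6] == b'-345'    # first id
assert key[6:7] == b'\x12'    # field 2 tag
assert key[7:8] == b'\x04'    # length 4
assert key[8:12] == b'-343'   # second id
# The runner asked the harness for data belonging to ids its bundle descriptor
# never registered, which is what bundle_processor.process_bundle raises on and
# suggests a runner/SDK-harness mismatch in this container build.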

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-26_06_31_21-13065895319378069408?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 774.885s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190626-131704
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2fac4a3b878d50901a659de3fc6ace438298754a88e6502d7eb609efb75e08f4
Deleted: sha256:b97376418e1c4172ab79dbac056a507b73725e4fea31240cf1558d10919b2002
Deleted: sha256:7729b026655011015eebf9550d64bcd25a01dc00353092d10f1f6cb579e74a82
Deleted: sha256:a44cff44f56a409d7a1a43ad36b543d65c37e38432d3c289609bb0832474a950
Deleted: sha256:0686a0eb29fb791d56c1329aaa01a0b57d96e8771961439403daa3bf632dbfd6
Deleted: sha256:ef220c9fe35fa1a9b7cb823afe62886a899d5f819f994ca7aa3c4f9b17dc8f93
Deleted: sha256:f8e1b42cc9916666a66e6919a29afa3aedf7f6465aa6fb309ea69db8c6c44f1a
Deleted: sha256:a45976d1bbb342f5ad398daf13507f4304b648fbdde289c25c70a10da2af1dcc
Deleted: sha256:caf84abc07ccc536b1e3e6fac639fe444b33aad2142f7c6056c53a1bff97be79
Deleted: sha256:29c419239900e32133e074dc62cdb11997b41c7e8ccbcee090a9ac83423ec8f4
Deleted: sha256:a9e1a8458e276d4e93604111051db59e2b8a6fd3345b654aaab4acf2678f51b3
Deleted: sha256:faec6e0686d14ff500c5eec9c06815588ae03d9ef6024705c9b743c86517e871
Deleted: sha256:8bcc4681e35632a270a8a009c8decb0fe31215941686a906a2e1ca2e687d7196
Deleted: sha256:819507839328ca277e0b776d5397c00d58960ab2da8ef1615cbece23f353fbd3
Deleted: sha256:2e2acd11f773cbc228dcc8456b0c91403b4efdf4aaa7a467f6da5b9abc4aed36
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2fac4a3b878d50901a659de3fc6ace438298754a88e6502d7eb609efb75e08f4
  Associated tags:
 - 20190626-131704
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190626-131704
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190626-131704].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2fac4a3b878d50901a659de3fc6ace438298754a88e6502d7eb609efb75e08f4].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3659

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3659/display/redirect>

------------------------------------------
[...truncated 205.49 KB...]
copying apache_beam/transforms/window.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/write_ptransform_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/typehints/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-26_05_14_55-5661818847762905581?alt=json>: response: <{'status': '404', 'content-length': '278', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 26 Jun 2019 12:16:53 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 404,
    "message": "(153517b8e86b8db): Information about job 2019-06-26_05_14_55-5661818847762905581 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

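The job-status poll above goes through the retry wrapper in apache_beam.utils.retry (the "retry.py ... in wrapper" frame) before the 404 surfaces. A minimal sketch of that decorator pattern with a hypothetical polling function; num_retries and the get_status call are placeholders, not Beam's actual poller:

from apache_beam.utils import retry

@retry.with_exponential_backoff(num_retries=5)
def poll_job(client, job_id):
    # Hypothetical client call; the real code path calls
    # runner.dataflow_client.get_job(job_id) as shown in the traceback above.
    return client.get_status(job_id)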
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... FAIL

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -368: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-346\x12\x04-344'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -368: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-346\x12\x04-344'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-26_05_07_47-2849454400700620521?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
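The KeyError key in the SDK-harness traceback above, u'\n\x04-346\x12\x04-344', looks like a length-delimited protobuf encoding of two ids (-346 and -344): the data channel delivered a bundle keyed by a target the bundle processor had not registered. A minimal decoding sketch, assuming the key really is two length-prefixed fields with single-byte tags and lengths (an illustration of the byte layout, not Beam's own parsing code):

def decode_two_field_key(raw):
    # Walk a key of the form: 0x0a <len> <bytes> 0x12 <len> <bytes>.
    fields = []
    i = 0
    while i < len(raw):
        length = ord(raw[i + 1])              # assumes single-byte tag and length
        fields.append(raw[i + 2:i + 2 + length])
        i += 2 + length
    return fields

print(decode_two_field_key(u'\n\x04-346\x12\x04-344'))  # prints ['-346', '-344']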

======================================================================
FAIL: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1331, in wait_until_finish
    'Job did not reach to a terminal state after waiting indefinitely.')
AssertionError: Job did not reach to a terminal state after waiting indefinitely.
-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-26_05_14_55-5661818847762905581?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 730.577s

FAILED (errors=1, failures=1)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190626-120010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2338333a7ad966429f96d3313d2378eeef59cedd8d5c948656c8c49f27ae18c1
Deleted: sha256:d0155abf855ddca16e5df86bce4e6bb2c5c525c41acad73649c5dafe73a7dac3
Deleted: sha256:301e095f0b48730e4267bd36db1124cdbcfe29665cbab8ce860b363599a9d0fd
Deleted: sha256:5f4004995281684abfc44f93a62b9b08ac177c1d6f5279b19238ac49082051bf
Deleted: sha256:16eb1df32a3ea8c7422dd7eba441c04897a00f5fa114791957230d389edf2f6f
Deleted: sha256:7c0ee55b6d36cac9db8bcca5c5c5a1d0862cf8ad440ec6d48f906204f9301532
Deleted: sha256:9dd6560eb505d4128932deebaa39a970b49a20084e1e6a7e51b4a9f849e70ca1
Deleted: sha256:e50a6c126d4815abec5b71c058e1c39d839bbae056cf81a3e224548973adf942
Deleted: sha256:1e634b92d70aa4d8e2726bec37fbec80cc459a529a95b18673670dc16bea7495
Deleted: sha256:730dbe41d522fa8a3ca95053b882c6f8d0299f0068a78b7fd94c965b85e9eda7
Deleted: sha256:1627aef628dc3208922679b0e6a5c097a19a753490978078302f7bc89ea85397
Deleted: sha256:f575531726a41a6fb96a548ddd8b868900e51a1e341b2ced9265e05814261a70
Deleted: sha256:ae71c37174d782e803b37f3be4fdb1f2d9cb884b1ae61bf23ba5454cd78b935b
Deleted: sha256:d4e0166bfd4fab1d0dc16737609d77296b9011ccced8e3f3fff8a9e0bbdb5537
Deleted: sha256:7277fb23b649306bb73c30bc3e88212cf5687bde256fa3522cf5e146bb0c0bce
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2338333a7ad966429f96d3313d2378eeef59cedd8d5c948656c8c49f27ae18c1
  Associated tags:
 - 20190626-120010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190626-120010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190626-120010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2338333a7ad966429f96d3313d2378eeef59cedd8d5c948656c8c49f27ae18c1].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3658

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3658/display/redirect>

------------------------------------------
[...truncated 208.41 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_23_07_27-7918505632319920828?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_23_13_40-12067708306009499587?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 721.480s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190626-060029
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:01d484b3d61a82210454924d52913ffba0696ce5964d681ece74a7247546676d
Deleted: sha256:f2366d30bc6a1e65d0dba80c64825b6363a576bb1debd3ec2daef28b99c30f4d
Deleted: sha256:a5a4ae5ed54d60bd6fc7cbebc6c2f5677b31260f244d567e5b79f3c75a3a2779
Deleted: sha256:9337a3dcb4d47ae203933bdcd1920274cdcce5a5ffd2e3daf2192bb96cb8d858
Deleted: sha256:2d732815faa92dc3657ddbd322f7895c3e1cb3cd984a060858200130c7f8a2b7
Deleted: sha256:9f6c53ecf3fb2e6027c9c2102abffce418ab608f18e7ac1780e47ae9ca6f5f56
Deleted: sha256:4f9a82e4a20d63ee34e47abfe69686efcc3469f6e0dcf3031098db5b0b5be399
Deleted: sha256:cbe5d0a6a2f1eee95820c913a0b76aaae5065ed0f9d65fdb3af65e32a5a1a67f
Deleted: sha256:2e0a917a1e7853c606e764b575d046734321a0d73ad3488e265b7c8f07cbeb70
Deleted: sha256:eb542bd4a176ab34b5a2667b25f15555d77453221b0804c5763e3f7ada7131fb
Deleted: sha256:ffbaa3a5411e30d2900d9a8a4bf5301cd1dfb77e167ece932d6e3e00b6290bdc
Deleted: sha256:bc928c66de75ca03f9efe627cfdf0e2ae67724fbd8feb8749bfba0d480b7c5d4
Deleted: sha256:386a95615d550dedb79994d0523ab6f54439234da4dbf26825cd0d16286066ea
Deleted: sha256:c0c527bc3de9196b5bf1cffefe6d8483eaef5c763a7c76616ade9787cf4ca6d2
Deleted: sha256:f51f6d87abb96529dc36b4928e2e574281b4a37e714429f5375ed978888ad8c1
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:01d484b3d61a82210454924d52913ffba0696ce5964d681ece74a7247546676d
  Associated tags:
 - 20190626-060029
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190626-060029
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190626-060029].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:01d484b3d61a82210454924d52913ffba0696ce5964d681ece74a7247546676d].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3657

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3657/display/redirect?page=changes>

Changes:

[github] Drop experimental from python streaming in the doc

------------------------------------------
[...truncated 207.44 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -315: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-257\x12\x04-255'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -315: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-257\x12\x04-255'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_42_21-8682120451149598089?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_50_14-15771768521306478378?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 819.485s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190626-003446
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ea6305a41da70a0e547b1e4914fa780b31aa09c10fab4763d2f4f778371a9601
Deleted: sha256:f8891ca073630479e971f8ec475245569b28f6bba8a18da4fb6cc7276402979e
Deleted: sha256:c4a2a3f6d6e29a1a2832f047f238077f740cb557eccce37c8f867a0ea5724175
Deleted: sha256:efc1dd2f3915085efb8c47d185ee1b9c13abe54617ebba668fd495443e84e946
Deleted: sha256:f74c7212980c4315b8120a5173fa577228472cea81cbfb1af6efca7b77ab2dd3
Deleted: sha256:0a5f9eb3920519cdd66f9156658c00d97508c2b73a251c18d9fd909bc91c4e39
Deleted: sha256:48d3e52b05bbc460b635b5e47273c4a4685c659f0dbdb71fed7116dae0896707
Deleted: sha256:b49bcfe4d74cfd9901795eb874966e2909509563d3d3da21511715bf0c80e56b
Deleted: sha256:720f686ee4f326706827c7f62497d2f111b8e124b5b33dee33c57e7b8456d558
Deleted: sha256:f0fb677c40b2021e20f4c8d056c70394d7c2bdd3daced7a28f50c71a9b4dbbba
Deleted: sha256:f3c20aa04dd291d3ab68e085c9c3272a0558a1aa8313f483246cf4a521e33ba4
Deleted: sha256:be9985b5a81667e2cd290d27717ec4f5af09823fccb55e29a69e5aac62000005
Deleted: sha256:1c60d5135bd85f7055097114d67e9410197a45b96a13a25eed798eae8d164e7f
Deleted: sha256:fe7b8f37e31c46e451c02146abfadb026392e52ed9e68f1d264c341c0ff4d6b8
Deleted: sha256:9f202700544748f1eae5efa22802aaccd9573f270af2ee0dcec44c7e1be89ce4
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ea6305a41da70a0e547b1e4914fa780b31aa09c10fab4763d2f4f778371a9601
  Associated tags:
 - 20190626-003446
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190626-003446
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190626-003446].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ea6305a41da70a0e547b1e4914fa780b31aa09c10fab4763d2f4f778371a9601].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3656

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3656/display/redirect>

------------------------------------------
[...truncated 207.40 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -316: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-258\x12\x04-256'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -316: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-258\x12\x04-256'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_21_09-12204522400759206743?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
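
The KeyError above boils down to a failed dictionary lookup inside the SDK harness: a data element arrives addressed to a ptransform id (the key containing -258 and -256) that the bundle processor never registered. A stripped-down illustration of that failure mode, not the actual worker code from bundle_processor.py:

# Hypothetical registered id; the real mapping is built inside
# apache_beam/runners/worker/bundle_processor.py (line 593 in the traceback above).
ops_by_ptransform_id = {u'\n\x04-100\x12\x04-98': 'some registered operation'}

incoming_id = u'\n\x04-258\x12\x04-256'  # the id reported in the log above
try:
    op = ops_by_ptransform_id[incoming_id]  # this lookup is what raises KeyError
except KeyError:
    print('no operation registered for ptransform id %r' % incoming_id)
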

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_27_42-6602011306892282636?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 809.141s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190626-001137
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ccd702246138e27b757970b3f9d0477dd9ec070471da126a6d904c5964203b72
Deleted: sha256:990c90297a6fd67ec4a20f52a560b6da1e0cf3533947d75c00946ebd45493fe8
Deleted: sha256:18e957d6208e2d1572c121b099c7a063a0b2f11f9d913143f88c11573f47c160
Deleted: sha256:718601fe5df6f73e7815c0241f33ef53cbeb670074ac85c70615d1b70edc6ba3
Deleted: sha256:0fa1c805fce652056f3747cf4cafafaff17fd5861fd3eede1a70796e9ae24d03
Deleted: sha256:0e742760771b235b9176521a886a976ac63a36082a95e1b9734d2bbc346480d5
Deleted: sha256:928481902614ee6ec50342e094753fb37a79fdbf429e6f8c356b7efed7ac9008
Deleted: sha256:12c24ec0249723ee474e9e0318168037245b6126fd551a828c2917e85975d71a
Deleted: sha256:e2ca9ea23720af1a915f13c6632d19e55d4d33d1aa55d77f9c58e55a67c30eae
Deleted: sha256:26c977c4c6618c4a75fafd4387df3413706b268e7dc309c25bf72e54cf6e55d5
Deleted: sha256:5e64f019afc3fd548cf9cb85f7d3ddf2e169367615605647d3639617240b5d8a
Deleted: sha256:35a9ecdd8739c7691c4abe500e7d39a1d76097805bb0ef2458ca0475f6051bbf
Deleted: sha256:add0b6cdd45cce0db4248d00bf3bc3c02e8bbec6d4b2caa7aa837476e926fbc2
Deleted: sha256:50fb0e0b88895697c3c177525af3937673bcd5364dedb4ec2b3c6c38319d4b4c
Deleted: sha256:04da7065e40096e33e61553f5fd7a31aad712a31198235136b39f4709f6860bc
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ccd702246138e27b757970b3f9d0477dd9ec070471da126a6d904c5964203b72
  Associated tags:
 - 20190626-001137
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190626-001137
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190626-001137].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ccd702246138e27b757970b3f9d0477dd9ec070471da126a6d904c5964203b72].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3655

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3655/display/redirect?page=changes>

Changes:

[github] [BEAM-7013] Add HLL doc link to Beam website

------------------------------------------
[...truncated 199.57 KB...]
copying apache_beam/testing/benchmarks/nexmark/models/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/models
copying apache_beam/testing/benchmarks/nexmark/models/nexmark_model.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/models
copying apache_beam/testing/benchmarks/nexmark/queries/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/benchmarks/nexmark/queries/query0.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/benchmarks/nexmark/queries/query1.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/benchmarks/nexmark/queries/query2.py -> apache-beam-2.15.0.dev0/apache_beam/testing/benchmarks/nexmark/queries
copying apache_beam/testing/data/trigger_transcripts.yaml -> apache-beam-2.15.0.dev0/apache_beam/testing/data
copying apache_beam/testing/load_tests/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/co_group_by_key_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/combine_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/group_by_key_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/load_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/load_test_metrics_utils.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/pardo_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/testing/load_tests/sideinput_test.py -> apache-beam-2.15.0.dev0/apache_beam/testing/load_tests
copying apache_beam/tools/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/coders_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/distribution_counter_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/map_fn_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/microbenchmarks_test.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/sideinput_microbenchmark.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/tools/utils.py -> apache-beam-2.15.0.dev0/apache_beam/tools
copying apache_beam/transforms/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/combiners.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/combiners_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/core.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/create_source.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/create_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_combiners.pxd -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_combiners.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_dataflow_distribution_counter.pxd -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/cy_dataflow_distribution_counter.pyx -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/dataflow_distribution_counter_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/display.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/display_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/dofn_lifecycle_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test_it.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/ptransform.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/ptransform_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/py_dataflow_distribution_counter.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/sideinputs.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/sideinputs_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/stats.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/stats_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/timeutil.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/write_ptransform_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/typehints/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_48_19-15664827633725863484?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 919.403s

FAILED (errors=1)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-234111
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:eca632fd489e510f5a118a2fc7d6283989520a58e182c9c5847f8604444af13e
Deleted: sha256:6e6bbb7959863979de964a09f868e8aa9cb52f1f8c4cf3f04b4061c7f5f004e7
Deleted: sha256:3edc65a3a1869591abaa568c21d0c290616d200f48d0cab0b385272fbc7a509b
Deleted: sha256:b4b3a4a2ae772dd3ad6eec9dc6a579743992711ebabf3cc5617f5d07bd82649e
Deleted: sha256:7b8e505c7461a276409986a6665957f5f42dde5a71862ac466f28799cca4d5b1
Deleted: sha256:ea4562241d925cf565fc2f2eb8ed5beb969a47293ac34d909130dbf2fcf9452a
Deleted: sha256:ee4cde875442128f929ec6449ae8aaecc136a543326afd6024da7e310801b5ed
Deleted: sha256:f60508ed2016e8f2a1283e9207a844bfdaa97196b8826f5fe5a81ce40fd9b5cd
Deleted: sha256:b7ed0e832c2e4444ed67bf34073a0c19b35696235519062b5405c880eeb47197
Deleted: sha256:6b2442e056ec5ddb8e35c36a6dcd01fd84c881bff03d65fc7d9e5573cfb7861f
Deleted: sha256:239ef455ceac7649124e0d629f0c40bbadff5eabaa8a96c2709141e7dc9eeed8
Deleted: sha256:58e9c384cd1cf65cb01080152f16380343cbc844e26a59878af8916920db998e
Deleted: sha256:8448184ef498f913e80dff1265ed66319aeb4c645fd565b984eee565d04417a1
Deleted: sha256:375e38dc8147a31408e8631a8b8a0727438d28fc8ca03043068463d65d513735
Deleted: sha256:ea72c63ed9ed92cd448fac95fe1d5ba32d97ec12ec38993ca140155f5342e886
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:eca632fd489e510f5a118a2fc7d6283989520a58e182c9c5847f8604444af13e
  Associated tags:
 - 20190625-234111
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-234111
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-234111].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:eca632fd489e510f5a118a2fc7d6283989520a58e182c9c5847f8604444af13e].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3654

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3654/display/redirect?page=changes>

Changes:

[aryan.naraghi] Implement splitAtFraction for the BigQuery Storage API

[aaltay] [BEAM-7475] update wordcount example (#8803)

------------------------------------------
[...truncated 207.73 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_11_52_31-11577367538246715194?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_11_59_49-5332975983266506041?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 814.708s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-184501
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:9c952f0c98412a239fe8826a17072e37acd14c12145275e7107e830fbba50792
Deleted: sha256:d150bdc086340e1c71ab323bbe9d8f40806b1fcfc0ff388be0dd70d255f419cb
Deleted: sha256:06462a9855612da27c35777ba4d26be7c7ba8cb01e4a183cb13d4d0d2fb45c76
Deleted: sha256:fe1dfe5c54153eb404b52ab954fb0b97d3e2ba06efee58c0bb1583fd9ab01887
Deleted: sha256:70e5b2708e8d9b7fc0dcde702794c4dc6b54efbf44d59aba8192aac568971efa
Deleted: sha256:490246b1f3541fdb978d13008131a0eb8575f111a66eb32be9dd45facde50dd9
Deleted: sha256:d7aada0263162ddc676127479b387af4c79b0b3b3017adfcf27a4357f24f0dfd
Deleted: sha256:0a8ce9059abac755d9bde123d9892a778d133dba1ee887b166e9a0278cc0e185
Deleted: sha256:75fc4667c21e93561c7742d8f94a48cda93abb1c0c7d626293597ead0672dd20
Deleted: sha256:24781471ab383affd5b24110a0576e74a199a93e3ced584fb80cc13c0f9c6b94
Deleted: sha256:c9037bba263047106d0ce6b7b6b3c7e62adee015a78818dbd7881ebe106780c5
Deleted: sha256:00ea15fb42f7737738b097adddfe61d651a9203128cb1f0db3786a294dd2e705
Deleted: sha256:d232874b99bed940923e3ab3429154fae7d43439e389368aa87a5eaab23a4a23
Deleted: sha256:a400fe28b11132544284aff31fd359254f3db98e4bc4dea70a2b6ba28b6b47d4
Deleted: sha256:fc1e0f01b522188cb52616b0893b761d87dc7b834adb54a7a27f2d5eebfad76a
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:9c952f0c98412a239fe8826a17072e37acd14c12145275e7107e830fbba50792
  Associated tags:
 - 20190625-184501
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-184501
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-184501].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:9c952f0c98412a239fe8826a17072e37acd14c12145275e7107e830fbba50792].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3653

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3653/display/redirect>

------------------------------------------
[...truncated 206.99 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -347: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-289\x12\x04-287'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -347: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-289\x12\x04-287'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_11_09_04-6905578210476031806?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
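
Both errors in this run fail the same way: the Python SDK harness receives a data stream for a ptransform id that is not among the input operations it registered for the bundle, so the lookup at bundle_processor.py line 593 raises KeyError (the repeated key, e.g. u'\n\x04-289\x12\x04-287', appears to be a short serialized protobuf naming the data target), and the Java runner harness then reports "Error received from SDK harness". A minimal sketch of that dispatch, with illustrative names rather than the actual apache_beam classes, is:

# Illustrative sketch (not the actual apache_beam source) of the dispatch
# that fails at bundle_processor.py:593: encoded elements are routed to
# input operations keyed by ptransform id, and an unknown id raises KeyError.
class BundleProcessorSketch(object):
    def __init__(self, input_ops_by_id):
        # Mapping built from the ProcessBundleDescriptor registered for this
        # bundle, e.g. {'-287': <data input operation>}.
        self._input_ops = input_ops_by_id

    def process_data(self, ptransform_id, encoded_data):
        # KeyError here means the runner streamed data for an id the
        # registered descriptor never mentioned.
        self._input_ops[ptransform_id].process_encoded(encoded_data)

One plausible reading in a ValidatesContainer run is that the freshly built worker container and the runner harness disagree about the bundle descriptor, rather than the tests themselves being wrong.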

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_11_18_42-7226590769032255125?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 979.528s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-180020
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1a37ddca58267746a5779edf1181c538aff6b430f2bb03772513b2ad898dc963
Deleted: sha256:efca6d995bfbacbe67778c6565fac75583d899b286c0c4f2a8351bfbac442f61
Deleted: sha256:3badc3bf4ad5dc2c1db838a167ed53bd53514e3ab4ea5fa4dc3916967eb82759
Deleted: sha256:9e65a354d114385cecde2d23b9fca4af130be0b5bdfc0f99cce85384f383209a
Deleted: sha256:4d37957a882c91ca8ce0271df59263b5064785ed8d5d3d4ae03e00ace6a01139
Deleted: sha256:64d317d28ca227ba73da519030be21c64829da3c3c3754eb50f287903302335b
Deleted: sha256:4914109826efd3550736c00dcb037d96ab59242b8cd6d8cb05e1996739f50afc
Deleted: sha256:2fa70706e5d5949bdb80c386b39a2d4fe8f1b43f253adb16e6fb5516c16fe7a2
Deleted: sha256:6e2f7b9c5e8fcd926b108a3ee87469a72d9fbd78934067a63add0aece36c22fd
Deleted: sha256:53cce98ee851653de762a58af6e6aa1dc32651643270818d596d1cb2c317245a
Deleted: sha256:172b043b3131ccfc22948cae7c3fbe6aea4c0597018d1a20379e48ebf9d99fa7
Deleted: sha256:2bb90de92f2c5626ab350b91c37486fc17c582dce2643031ba081a929d25d2ba
Deleted: sha256:46c8eefeef5bcb92adf62bfac51d93e14bd0f8b385fc7911da284a21b5bfebf1
Deleted: sha256:c90cadaf66db910ce411d98116c66ebbc5dfe1c795900cfad398b5e9abbb4843
Deleted: sha256:111852a8606cf613c53149ad35d6118db5e70f0fd7d25f62645085decc249725
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1a37ddca58267746a5779edf1181c538aff6b430f2bb03772513b2ad898dc963
  Associated tags:
 - 20190625-180020
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-180020
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-180020].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1a37ddca58267746a5779edf1181c538aff6b430f2bb03772513b2ad898dc963].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3652

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3652/display/redirect>

------------------------------------------
[...truncated 206.97 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -318: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-260\x12\x04-258'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -318: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-260\x12\x04-258'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_05_23_10-8620514470769447879?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
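
For the record, the DataflowRuntimeException at the top of each traceback is raised by wait_until_finish once the job reaches a terminal state other than DONE: test_dataflow_runner.py calls wait_until_finish with a timeout, and dataflow_runner.py line 1338 wraps the final state and the runner's last_error_msg into the exception that nose then reports as ERROR. Roughly, and simplifying the real classes named in the traceback above:

# Simplified sketch of the failure path named in the traceback above; the
# exception name and message format match the log, while poll_until_terminal
# is a hypothetical stand-in for the real polling code.
class DataflowRuntimeException(Exception):
    def __init__(self, msg, result):
        super(DataflowRuntimeException, self).__init__(msg)
        self.result = result

def wait_until_finish(job, duration=None):
    state = job.poll_until_terminal(duration)  # hypothetical helper
    if state != 'DONE':
        raise DataflowRuntimeException(
            'Dataflow pipeline failed. State: %s, Error:\n%s'
            % (state, getattr(job, 'last_error_msg', None)), job)
    return state
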

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_05_29_48-3820152969964359933?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 784.618s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-121511
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:58ba2423c7137a79ec9077ff4c53993c6805bb4e9b06fade46f73b523a6594e2
Deleted: sha256:0a1a445559f752e9c03f6980e83c5c8be9b76ed0d92e3dde2cdbc46dc4e2993e
Deleted: sha256:4943ea7264122385b0b940a78442508047ec9bd6499ea3b994ff72ad43047f55
Deleted: sha256:40acbf8b335cab270d5cb842a6327acd397617de1c199ef74850d388454a3b78
Deleted: sha256:65ae62a3b7f4d153782566af70e045d5c18086562541e014bd395fa2a74f1b5f
Deleted: sha256:738f1df5bc4c2aa80685dda34e571d6ef29bd2a70d0e82b7b2f406ad68276736
Deleted: sha256:9aee0d549663aa8138ca68e8ae1dfec345c5015f7c91e069ada220a35a07b303
Deleted: sha256:40061f686931dc1ad80f22b95ad0bd0f499b7c4bc6c71b9839407d96e0f081c7
Deleted: sha256:fc7ad554938d9dd4704becd9fa8a000f79ecc3d1f292a484bfbe6ab3e9d44e20
Deleted: sha256:ab5fea9017a6032061ccc1d8ef0a2ac6dc3414d360a8ffe12bd8513a2317c35e
Deleted: sha256:cc934394a83c533d37ed0a6270847ab1983ab25541fe1c1845fdbeee1da5a7e8
Deleted: sha256:8dd656f915118b545c40a48e3c85f5db741592f8ef3ff3f150cfa7669ba16215
Deleted: sha256:c5eb531978d2414d95433ab5c62c03804dd5d2fb7b4e83037779c4b48b46e6c6
Deleted: sha256:809d94444869db245d275aff72316868672fa2b8a9ca3fa852f81204ec8fc0cc
Deleted: sha256:1a4fe6f56af250d2082e8e073b7a1b090402aa19372843e1e4546c9ad0bb580a
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:58ba2423c7137a79ec9077ff4c53993c6805bb4e9b06fade46f73b523a6594e2
  Associated tags:
 - 20190625-121511
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-121511
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-121511].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:58ba2423c7137a79ec9077ff4c53993c6805bb4e9b06fade46f73b523a6594e2].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3651

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3651/display/redirect?page=changes>

Changes:

[robertwb] Update portable schema representation and java SchemaTranslation (#8853)

------------------------------------------
[...truncated 207.06 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
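
The two "Discarding unparseable args" warnings above are expected noise rather than part of the failure: --output is passed in --test-pipeline-options for the example's benefit, but PipelineOptions itself does not define it, so it logs that it is discarding it, while the example's own argument parser is what actually consumes --output. The split looks roughly like this (simplified; only the option name mirrors the log, the parser is not the exact wordcount.py code):

# Rough sketch of the known/unknown argument split behind the
# "Discarding unparseable args" warnings (illustrative only).
import argparse

def split_args(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument('--output', required=True)  # consumed by the example
    known_args, pipeline_args = parser.parse_known_args(argv)
    # pipeline_args (--runner, --project, --temp_location, ...) go on to
    # PipelineOptions; because the full --test-pipeline-options string also
    # contains --output, PipelineOptions reports it as unparseable.
    return known_args, pipeline_args
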

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -368: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-346\x12\x04-344'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -368: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-346\x12\x04-344'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_02_20_48-10327141686041750034?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_02_26_50-8069022255985304548?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 764.507s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-091309
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:0e579e234902ae26321c9886dde8d5d5746ebb11051e1d5b8d8f9d01e4b2e809
Deleted: sha256:c4da0a65d8186c58a600d55d6419432ac2ec604cc634b46fb4e25720d26865bb
Deleted: sha256:38913cb0b92d09d4e9285f4bdbef0fb9fc69922f57470bd2d606057eb8812c1a
Deleted: sha256:e1d29eff3f5195e558abdce49a23636177951229cfb8582b8212308bd85d0493
Deleted: sha256:da357e3300af6e90e33f31b03795198c57a880a04dd63e9e92d3660194814b64
Deleted: sha256:c1b370e028e9012db44c7513c46403df0d70f3340676c600c081b57a83125721
Deleted: sha256:6f6f8ab517808dfb3cb757c192f0f7d2fc85e99e45b8b3e5b14f8507e8ad5b12
Deleted: sha256:517534e1a0af10b6724448e4612a6bec4aee5e0356e02208e032fa703e8dc6b4
Deleted: sha256:93cd25c92e8ebecb2c68d70e4bdd92a994f1de4b5a62259d6cbf003d3a800002
Deleted: sha256:ed6fb90db747eefeff837662a8cdd10b9c965ff6966f223f67e7c5f54c8f1fc6
Deleted: sha256:5f769c1e581059ce1d472806abb04c48010d4d887e721a1059a2ef5730d0298a
Deleted: sha256:cf782328c8430e2f44c8bfa8bd12bbe6fddb0db4e240b51a739da7e9655586b1
Deleted: sha256:9126616473f6a26a9044c23be2df97d5f49387c6108a11a9c0618ba70d30de04
Deleted: sha256:22cf64125425e961e1d5527c02e19fbb16fec3e11b779bc42697e2a540d16947
Deleted: sha256:78ea4c3b2fbc4f9df28fdb605373c22594f278b2040222452cda651abf3dcbd6
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:0e579e234902ae26321c9886dde8d5d5746ebb11051e1d5b8d8f9d01e4b2e809
  Associated tags:
 - 20190625-091309
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-091309
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-091309].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:0e579e234902ae26321c9886dde8d5d5746ebb11051e1d5b8d8f9d01e4b2e809].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3650

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3650/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7606] Fix JDBC time conversion tests

------------------------------------------
[...truncated 207.76 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_01_48_50-5210554799178084792?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_01_55_33-8345684414484712915?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 784.193s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-083459
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1cfced40bf8eecd552019821790e686bc414feb2005b93a4c97eaf86caf60231
Deleted: sha256:0158fa743d259578b0b5a93ae52dd54d7579da1a3a67395dfa046e64141c118e
Deleted: sha256:70e2b49d619087c655779c3cfbf81720217ffff1c1e52deae6205de89e7d348c
Deleted: sha256:90e3cc76739f0ff9bf9e2c1c7aa9ba283b830088f3d15b95fe65d9f9e1e6a68b
Deleted: sha256:abf1385b0e88438394325a9ec8c4983e22f42ccf741f2b232c53b4b03cf69d9b
Deleted: sha256:39acb049556e1780508a43242b0cabf789f0c7864fbaef39d48b3c157aebef2e
Deleted: sha256:5a4db428d098a9550d615e528adf3be9f81c27357426c9f06464d280a8f3f253
Deleted: sha256:2da5e9869496fadeb991512f3e4e0ea84359531bb0053d690bff0b1e56df5704
Deleted: sha256:2a97cc58fa78a15a1f9a2190e7409aad5f4803f2e7079d6797e9c0775138e826
Deleted: sha256:8663cf0570e81a3234416b115178f43b2fd7f6cc2917a91e981370c36a816c92
Deleted: sha256:f5cbc30f192fa384bc46a576af452678726605d0c88106250d2ea8fc4953e024
Deleted: sha256:9f472e03f5ccf27a992696fef906548766f1ab8d108319f3a956139b370555e1
Deleted: sha256:fd3a6ff1ef919ee760e9a66460b887b78910009245d79b4ef68aaf5291bd92c5
Deleted: sha256:05602951d2a3ba4192e31dfc5aca9bfe683fb3d99ea812684e8e43b266bcb7c5
Deleted: sha256:94bf8032ff6f7f3f96f975a6c9b547329af22e1a1bd71b695f5f8c36c66c47ae
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1cfced40bf8eecd552019821790e686bc414feb2005b93a4c97eaf86caf60231
  Associated tags:
 - 20190625-083459
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-083459
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-083459].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1cfced40bf8eecd552019821790e686bc414feb2005b93a4c97eaf86caf60231].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3649

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3649/display/redirect>

------------------------------------------
[...truncated 207.83 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -316: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-258\x12\x04-256'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -316: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-258\x12\x04-256'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_23_08_18-8367375198402428464?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_23_14_01-3746139411192392061?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 754.003s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-060007
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:48af839cb45e800d4de51cfc27fd3addbaaca982a1090d37218fcede2fba7da5
Deleted: sha256:61a81d28847fbad30cce8520a29f34307bc621642005128696d8ce1f7ed395cb
Deleted: sha256:2396791d37852bc8add2521031cdf33f7026f22ad72466bcb8b323c1db5abf20
Deleted: sha256:1cf1673aaa0a907a7fe592c78e17e049cf75414f17cb5efb81be69cba67fefdd
Deleted: sha256:c67f3e20eb314f123d8e990bb70063d388787904be4a65f36a910c00534cf710
Deleted: sha256:14fc550fa0a31ffd628f792bb61adfe1489c97f29ebb0463c4f70d374ecdbb29
Deleted: sha256:47badceca8ae595ce8c42508e9103ba3c71047821a349056eff30444191931fa
Deleted: sha256:477a0f3113a16ee34c99063b5037668e03c80f2acdbd538e08e0eff2d156d5e3
Deleted: sha256:8632f74e863240fd2ec907bdf8e6445a28e138f3f862119681353483cb727d3a
Deleted: sha256:b194eea8416ef126950f440569f2155751a30053790080c6650fad2117dff60d
Deleted: sha256:7f0b441ed40a77160ae9eb25e27bc992fdd6f1b953f4ce7f077d31e6c5236313
Deleted: sha256:20018531c7f904f9a76134087592d7aa2773ab94881e35cdd7c5019545360092
Deleted: sha256:16c34f6f71857c5495797a65ffdceebb579152a73177d03ceda866c76942eb3e
Deleted: sha256:342ac4ecafd0507692112e7289fd61ad97430c40a86b791f579d4e551bd48c3b
Deleted: sha256:9b92f936a1046190f625a7034efe910b3cddf29c8e631feed1a48ab0db4952aa
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:48af839cb45e800d4de51cfc27fd3addbaaca982a1090d37218fcede2fba7da5
  Associated tags:
 - 20190625-060007
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-060007
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-060007].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:48af839cb45e800d4de51cfc27fd3addbaaca982a1090d37218fcede2fba7da5].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3648

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3648/display/redirect?page=changes>

Changes:

[aaltay] BEAM-7141: add key value timer callback (#8739)

------------------------------------------
[...truncated 206.96 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -382: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-324\x12\x04-322'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -382: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-324\x12\x04-322'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_19_18_46-1043663461619361562?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_19_25_10-14742940526786246366?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
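
The KeyError value in the traceback above, u'\n\x04-108\x12\x04-106', is the opaque key the bundle processor looked up for the incoming data. It reads like a protobuf message with two length-delimited string fields (plausibly a transform reference "-108" and a target name "-106", though that mapping is an assumption, not something the log states). A minimal, standard-library-only sketch for making such a key readable, assuming it follows the protobuf wire format with single-byte lengths:

    # Sketch for illustration only: decode a key like the one in the KeyError,
    # assuming it is a protobuf-style sequence of length-delimited string fields.
    def decode_length_delimited_fields(key):
        fields = {}
        i = 0
        while i < len(key):
            tag = ord(key[i])
            field_number, wire_type = tag >> 3, tag & 0x07
            if wire_type != 2:        # only length-delimited values are expected here
                break
            length = ord(key[i + 1])  # a single-byte varint is enough for ids this short
            fields[field_number] = key[i + 2:i + 2 + length]
            i += 2 + length
        return fields

    print(decode_length_delimited_fields(u'\n\x04-108\x12\x04-106'))
    # -> {1: u'-108', 2: u'-106'}, i.e. the runner sent data for ids the
    #    SDK harness never registered, which matches the KeyError above.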

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 772.688s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-021140
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:fd381f831a7960090fc9988b9bc71d2903a8bc4e9bcf6901f23136a18593f089
Deleted: sha256:3735e707e67c43e516e3018621920745a18b8d7088c32ad04170db972086c6ce
Deleted: sha256:eb25457ce471508b7f43503fe51e2a236ca549fbf8218f963d16b984de526d7e
Deleted: sha256:5d81c5ecf2ede2dcc363249e9bb72993ca8c69606440b908dbd5c6faa7cc83a6
Deleted: sha256:7534a7b078182fa4a609e5e259dcfb772b0e13a30c5bdc409fc541317a51cc9d
Deleted: sha256:d3c57a44c66f48a6416b16f616a00cff79623234253e94561cc3548798b14517
Deleted: sha256:6b6b743f767c5aa37497147b5374757e90a629c841edf50183511fe4410b09a8
Deleted: sha256:2d11959eb4447b346c53300b44c18243afed6c472b4de52e63b18438be8b49ec
Deleted: sha256:462ead3536248afedaa365abdf12685df5400d92b2fa7035c572dca65dde2fb3
Deleted: sha256:e560ec2b0636a41f3bb6d98f1ae335b6192fb5d2670ce8b5830a25716b129184
Deleted: sha256:ea62abecb96d751802d044d845984997e38ae41535f0669cf669b92c60e1493f
Deleted: sha256:4f13a6039564e6a1a79334b2b54608b2206b3eb3cba8d77c570195bd06364081
Deleted: sha256:96c777fc03dc80469e238bb5242e6d6f41b0b9b6c5058ad2a204a9e5ee6a5b3e
Deleted: sha256:6d0fac852d4040dd1aefd54638f53a4b1fc06e100b5ce56c1442ab415009f4f1
Deleted: sha256:9d031107d741539217d18bb044777f07c601a255f1bf96a343936e37d4e1f889
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:fd381f831a7960090fc9988b9bc71d2903a8bc4e9bcf6901f23136a18593f089
  Associated tags:
 - 20190625-021140
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-021140
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-021140].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:fd381f831a7960090fc9988b9bc71d2903a8bc4e9bcf6901f23136a18593f089].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3647/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for ParDo transform

------------------------------------------
[...truncated 207.30 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
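
The "Discarding unparseable args" warnings here are expected noise: --output belongs to the wordcount example itself, so Beam's pipeline-option parsing does not recognise it and reports that it is being discarded. A short sketch of the usual split, assuming the parse_known_args pattern the example follows (flag values are taken from this log, the rest is illustrative):

    import argparse

    from apache_beam.options.pipeline_options import PipelineOptions

    argv = ['--runner=TestDataflowRunner',
            '--output=gs://temp-storage-for-end-to-end-tests/output']

    # The example consumes its own flags first ...
    parser = argparse.ArgumentParser()
    parser.add_argument('--output', required=True, help='Output file prefix.')
    known_args, pipeline_args = parser.parse_known_args(argv)

    # ... and only the remaining runner-level flags are meant for PipelineOptions;
    # when the full list (including --output) is handed over instead, it warns as above.
    options = PipelineOptions(pipeline_args)
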
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_18_56_37-7520973017983395620?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_19_03_10-12421192430788397332?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 894.286s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-014915
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:5a07415e46db7016823bbb4d3113c56d6d3fc29a1277fb7fae47af71d9e93ab3
Deleted: sha256:33cd5a212e839ecddd9f64f78bfabe39bead394a5697d22ddfc0bcb6bd99954f
Deleted: sha256:de66ae7dc73d4907993257810d3e8e3f6ed213c299f8f3590edf975fe856c149
Deleted: sha256:2cfb79d2d977f3f41c7c792a79860212b5fba463b1d261ec818f6f0b869d375f
Deleted: sha256:6b8b5aadfde111d2fb673b676ae683436d8ac3e547fe3471df91258d1432417f
Deleted: sha256:7e7316f2862b4739c2afa83f188c093dd95f1d67476b17f3a26104a15685d76f
Deleted: sha256:a9997444c302db4359234b24fb56efdfe3617a2766dcd0992ed6e6dc2aa3b67d
Deleted: sha256:a0761bc8afd95044bbe5818fd8437163b0b71564e0cf4a822b23f6a1a477166f
Deleted: sha256:f0cf138bb3f21644694309e0b7d4cd06040f7a00f9de826afe545aef31334fa6
Deleted: sha256:f4009d4387f3c4c40a00382cfc78b191cfdc5a539d7a7b5430b624ab28e939eb
Deleted: sha256:e6e458ca2cf4fcd3b88b68dc9ecbe3d154a53c20b39262af4245346ef57750b2
Deleted: sha256:9eacb060fe3466d561b2bdc1ddc17354a4eaef8c3f336d08b6dc4ecbdae10e35
Deleted: sha256:2408f8444c0cafa2825ee4caf0a5a3ed44c10825e5bebe1d9fb91289d32b5abf
Deleted: sha256:5232c606f897588ae6b18e3d1b96b5e4e64aa906d0b963e5aa1b927dd75dcc6d
Deleted: sha256:c1124f54ac3ad2ab9e6b99ca2e45515c800c067caf3c0cf9a924d460b9a46e0b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:5a07415e46db7016823bbb4d3113c56d6d3fc29a1277fb7fae47af71d9e93ab3
  Associated tags:
 - 20190625-014915
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-014915
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-014915].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:5a07415e46db7016823bbb4d3113c56d6d3fc29a1277fb7fae47af71d9e93ab3].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3646

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3646/display/redirect?page=changes>

Changes:

[markliu] [BEAM-7598] Do not build Python tar file in run_integration_test.sh

------------------------------------------
[...truncated 207.46 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_17_07_55-2430523207125955821?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_17_14_43-15410627741436876041?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 784.156s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190625-000021
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2909a46f1e186e236e7b0b78a9bac1419edf7e1c5968136a6e468f4bc6ce01b1
Deleted: sha256:cb52626b449f9c84936ff2cd9226f5bbe1e83bb2fe52f77f510e6a016b04e468
Deleted: sha256:22fc9b55e1de3612ce1aca82463695d45a1a9db17cb3bd3f769eefd84c5566df
Deleted: sha256:4865b5ef9d12eda859a58c69e483bec6ae1f220634802435fdccd92c4a07b36d
Deleted: sha256:f7a06db75aab6c29217a570a699b36882f18b312a6fa92f2a5050f9efd16cc5e
Deleted: sha256:3f51dbbd5f66a79f5a7a7fd2a01af72cb020691bb7fd9f9704245304d53477e6
Deleted: sha256:65c178fb49bcc58697fd7ec43e66fc4ef719c8c4442ee6c3c725f9e4de29a842
Deleted: sha256:2167f86e1261c16bb19ca5dd88e5eec7c07da9ecac61390cb8953d2f73878c1b
Deleted: sha256:fdd9027c256cdb94c1cc9d89f722dcb6ceafe0aa258388672a16d07d7761f433
Deleted: sha256:64422e39223072e1943507114be8360fe61da676a3706c5e17c6233c60a62b74
Deleted: sha256:db370053507914cb8c0894f7c20c67facfc3c21dd6886d32a4da6209477b01c7
Deleted: sha256:ac6030e9f65b6a4d1af0a5a21882fcaf65d5bf0869039e2e648d84aa25222625
Deleted: sha256:a3082c14313b1a4216c1345cc72063bdb121d4ad1b961c2c76fcdc424310d150
Deleted: sha256:e0df3c57dddb289a0f4c23014599bff61d67e59c597e10d60a73ee68a8a7ac85
Deleted: sha256:6b895cb2c59643495d17f9ce6b081c70d16ccff30ed11c63af5e01825c21b7b3
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2909a46f1e186e236e7b0b78a9bac1419edf7e1c5968136a6e468f4bc6ce01b1
  Associated tags:
 - 20190625-000021
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190625-000021
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190625-000021].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2909a46f1e186e236e7b0b78a9bac1419edf7e1c5968136a6e468f4bc6ce01b1].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3645

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3645/display/redirect>

------------------------------------------
Started by GitHub push by markflyhigh
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/beam.git
	at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:894)
	at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1161)
	at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1192)
	at hudson.scm.SCM.checkout(SCM.java:504)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
	at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
	at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
	at hudson.model.Run.execute(Run.java:1810)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git config remote.origin.url https://github.com/apache/beam.git" returned status code 4:
stdout: 
stderr: error: failed to write new configuration file <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/.git/config.lock>

	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2042)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2010)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2006)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1638)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1650)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.setRemoteUrl(CliGitAPIImpl.java:1284)
	at hudson.plugins.git.GitAPI.setRemoteUrl(GitAPI.java:160)
	at sun.reflect.GeneratedMethodAccessor49.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:929)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:903)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:855)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
	Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
		at com.sun.proxy.$Proxy114.setRemoteUrl(Unknown Source)
		at org.jenkinsci.plugins.gitclient.RemoteGitImpl.setRemoteUrl(RemoteGitImpl.java:295)
		at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:882)
		at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1161)
		at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1192)
		at hudson.scm.SCM.checkout(SCM.java:504)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
ERROR: Error fetching remote repo 'origin'
Retrying after 10 seconds
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/beam.git
	at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:894)
	at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1161)
	at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1192)
	at hudson.scm.SCM.checkout(SCM.java:504)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
	at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
	at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
	at hudson.model.Run.execute(Run.java:1810)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git config remote.origin.url https://github.com/apache/beam.git" returned status code 4:
stdout: 
stderr: error: failed to write new configuration file <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/.git/config.lock>

	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2042)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2010)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2006)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1638)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1650)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.setRemoteUrl(CliGitAPIImpl.java:1284)
	at hudson.plugins.git.GitAPI.setRemoteUrl(GitAPI.java:160)
	at sun.reflect.GeneratedMethodAccessor49.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:929)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:903)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:855)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
	Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
		at com.sun.proxy.$Proxy114.setRemoteUrl(Unknown Source)
		at org.jenkinsci.plugins.gitclient.RemoteGitImpl.setRemoteUrl(RemoteGitImpl.java:295)
		at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:882)
		at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1161)
		at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1192)
		at hudson.scm.SCM.checkout(SCM.java:504)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
ERROR: Error fetching remote repo 'origin'
Retrying after 10 seconds
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/beam.git
	at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:894)
	at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1161)
	at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1192)
	at hudson.scm.SCM.checkout(SCM.java:504)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
	at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
	at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
	at hudson.model.Run.execute(Run.java:1810)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git config remote.origin.url https://github.com/apache/beam.git" returned status code 4:
stdout: 
stderr: error: failed to write new configuration file <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/.git/config.lock>

	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2042)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2010)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2006)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1638)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1650)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.setRemoteUrl(CliGitAPIImpl.java:1284)
	at hudson.plugins.git.GitAPI.setRemoteUrl(GitAPI.java:160)
	at sun.reflect.GeneratedMethodAccessor49.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:929)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:903)
	at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:855)
	at hudson.remoting.UserRequest.perform(UserRequest.java:212)
	at hudson.remoting.UserRequest.perform(UserRequest.java:54)
	at hudson.remoting.Request$2.run(Request.java:369)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at hudson.remoting.Engine$1.lambda$newThread$0(Engine.java:93)
	at java.lang.Thread.run(Thread.java:748)
	Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to JNLP4-connect connection from 103.55.66.34.bc.googleusercontent.com/34.66.55.103:46040
		at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
		at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
		at hudson.remoting.Channel.call(Channel.java:955)
		at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
		at com.sun.proxy.$Proxy114.setRemoteUrl(Unknown Source)
		at org.jenkinsci.plugins.gitclient.RemoteGitImpl.setRemoteUrl(RemoteGitImpl.java:295)
		at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:882)
		at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1161)
		at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1192)
		at hudson.scm.SCM.checkout(SCM.java:504)
		at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
		at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
		at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
		at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
		at hudson.model.Run.execute(Run.java:1810)
		at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
		at hudson.model.ResourceController.execute(ResourceController.java:97)
		at hudson.model.Executor.run(Executor.java:429)
ERROR: Error fetching remote repo 'origin'
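
The repeated fetch attempts above all fail at the same step: "git config remote.origin.url https://github.com/apache/beam.git" exits with status code 4, which git reserves for "cannot write to the config file", and stderr points at a leftover .git/config.lock in the workspace. git writes config changes to config.lock first and then renames it over .git/config, so a stale lock left by an interrupted git process (or a permissions/disk problem) blocks every later config write. A minimal sketch of clearing such a lock before retrying; the workspace path is a placeholder and the decision to delete the lock is an assumption for illustration, not something the Jenkins job does:

# Sketch only: clear a stale .git/config.lock, then retry the failing step.
# WORKSPACE is a placeholder, not the actual Jenkins workspace path.
import os
import subprocess

WORKSPACE = '/home/jenkins/workspace/beam_PostCommit_Py_ValCont/src'
lock_path = os.path.join(WORKSPACE, '.git', 'config.lock')

if os.path.exists(lock_path):
    # Only safe when no git process is still running in this workspace.
    os.remove(lock_path)

subprocess.check_call(
    ['git', 'config', 'remote.origin.url', 'https://github.com/apache/beam.git'],
    cwd=WORKSPACE)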

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3644

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3644/display/redirect?page=changes>

Changes:

[zyichi] Fix errors in py sdk io utils and add unit tests

[gunnar.schulze] [BEAM-7572] ApproximateUnique.Globally and ApproximateUnique.PerKey

------------------------------------------
[...truncated 207.20 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
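
The deprecation warning above names the replacement module but not what the change looks like in code. A short, hedged sketch of a read migrated to the v1new API, assuming the ReadFromDatastore transform and Query type exposed by apache_beam.io.gcp.datastore.v1new; the project id and entity kind are placeholders, and actually running it needs Datastore credentials:

# Sketch of a Datastore read on the v1new API (placeholder project/kind).
import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

with beam.Pipeline() as p:
    entities = (
        p
        | 'ReadEntities' >> ReadFromDatastore(
            Query(kind='MyKind', project='my-project')))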
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
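
The two "Discarding unparseable args" warnings come from PipelineOptions parsing the flags passed via --test-pipeline-options in the shell step above: --output is not a standard PipelineOptions flag, so it is set aside and handed back to the individual test as an extra option. A hedged sketch of that plumbing, using the TestPipeline.get_full_options_as_args call that also appears in the tracebacks below; the option values are placeholders, fake_argv stands in for what nose passes through, and running it needs apache_beam[gcp] for the Dataflow runner:

# Sketch of how an integration test consumes --test-pipeline-options.
# Placeholder values throughout; fake_argv mimics the nose command line.
from apache_beam.testing.test_pipeline import TestPipeline

fake_argv = [
    '--test-pipeline-options='
    '--runner=TestDataflowRunner --project=my-project '
    '--temp_location=gs://my-bucket/temp --output=gs://my-bucket/output'
]
test_pipeline = TestPipeline(is_integration_test=True, argv=fake_argv)

# --output was discarded by PipelineOptions (hence the warning above), so
# the test re-attaches it, along with the fn_api experiment, before handing
# a single argv list to the example's run() function.
extra_opts = {'experiment': 'beam_fn_api',
              'output': test_pipeline.get_option('output')}
argv = test_pipeline.get_full_options_as_args(**extra_opts)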
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_11_38_51-14914624601743640559?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
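
Both failing tests die with the same shape of error: the Dataflow worker reports "Error received from SDK harness", and the Python traceback inside it ends in a KeyError raised while process_bundle looks up data.ptransform_id. The runner streamed data elements keyed by an id that the SDK harness inside the custom worker_harness_container_image never registered for that bundle, which typically indicates a mismatch between the container's SDK bits and the runner-side job submission rather than a problem in the test pipeline itself. A stripped-down sketch of the failing lookup; the class and attribute names are illustrative, not the actual bundle_processor internals:

# Illustrative sketch of the lookup that raises the KeyError above.
class MiniBundleProcessor(object):
    def __init__(self, input_ops_by_ptransform_id):
        # Populated from the ProcessBundleDescriptor registered by the
        # runner, e.g. {'-344': data_input_operation}.
        self.input_ops = input_ops_by_ptransform_id

    def process_bundle(self, data_elements):
        for data in data_elements:
            # If an element arrives keyed by an id (or, as in the log, a
            # serialized target proto) that was never registered, this dict
            # lookup raises KeyError, which the harness reports back to the
            # runner as "Error received from SDK harness".
            self.input_ops[data.ptransform_id].process_encoded(data.data)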

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_11_44_59-14158274389314876931?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 826.432s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-183108
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:964ee6f6d50264a3666a0417f60aa177b02462424f8fcc4b989496bf9dc2f061
Deleted: sha256:a7210a70a66c94b632f9a42af4bf21c6084737e29a21d680c9f8b98e51602a4d
Deleted: sha256:7903138d4552268637a70179048f900d70a73bb17ea3acfcd9c7b13e91ae93a6
Deleted: sha256:941bc901163d09e7485604247bbf70382ed72e5686adddbcc4b008a92ed21faf
Deleted: sha256:93134961c3fbce3c0516b7277a33f67b57c5ea041808e5e7c0756b69f7628417
Deleted: sha256:ba33ea297b295f48cc2b1db35ec9821f01c2c67793f751b32c193a9c891d818f
Deleted: sha256:967c53ae42dc09c49fb8e3e630ea9e5bf0b618c1c0544725a1f21d6036118355
Deleted: sha256:737be6882be88ded9e216019dd268f536d024e24e6f78154e81a53ee23526a6c
Deleted: sha256:5b10c5846ff6eed7ef462e709de9a1cf51d009f2db007a432968a5830c219c36
Deleted: sha256:9539ca9d9c26706b8de023cf7216029a0cecbcee51fe2aed3089369e52647a71
Deleted: sha256:570df37ed99e50f196b8405bb351f9d9f10a3ad1e995e5d5956766aba859b685
Deleted: sha256:92882acc21e718ef6b9252399ae617623294b43b337293229533d94b3f83a0dc
Deleted: sha256:61f9e60ee67d694ab436ff3b4d48a341046032b518b01e5e9dccfe00fedcd4b2
Deleted: sha256:02ec5734d367a16653b46a5502fbc9597a44722aed6125e8f3ce739feba11626
Deleted: sha256:78ccbd0463f49e13ee800ce949fcc7e07e3aeecea603c9ccdb59c62b7b4d547a
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:964ee6f6d50264a3666a0417f60aa177b02462424f8fcc4b989496bf9dc2f061
  Associated tags:
 - 20190624-183108
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-183108
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-183108].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:964ee6f6d50264a3666a0417f60aa177b02462424f8fcc4b989496bf9dc2f061].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3643

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3643/display/redirect>

------------------------------------------
[...truncated 207.16 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -414: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-356\x12\x04-354'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -414: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-356\x12\x04-354'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_11_13_41-13744276408750606718?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_11_20_09-8399324716852133659?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 859.303s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-180035
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:ad4db64c40b08cece35a7a8697628bc695c9d026d7912bbfe041eedd6e47940c
Deleted: sha256:f3600c5585b8a27441951669937a0ac525b330758cf9b5c4367668083e1b08bd
Deleted: sha256:9431013b882e68552279f0d0bcd0337c56bbe84f551f0c5b106beaee5edb60df
Deleted: sha256:7dbd93ba24e16c61bdf03feccc27a4e0207730a18645cb61c6ca44528500d31c
Deleted: sha256:b278cf26c40d2c1e89f0141ee556a4a75ffe87849407584cfb6078643955bbf2
Deleted: sha256:5424407e39e27dda3ebef4ddf6823887f6adfd36d10f5542867cb7bd17b785b2
Deleted: sha256:70cfe121bc3bfef17179e64908bc2cec81e758502a82d4e4145e00ade0744575
Deleted: sha256:a4ece1acdcd6a783f21c8479a6835323507133709640c8c93421d3d988f217b2
Deleted: sha256:56a1f78adcd8f01b3be7716212ccab6e88705411b35dcf40630e4fe6729ac84f
Deleted: sha256:e894504d6dad2c86f1a46563aca61b423d8426fb99ddc3b139fc62722cc5b557
Deleted: sha256:9431afd8296c9a358d4599de8900e94bb3018ec4c779efd0641b29830f432751
Deleted: sha256:2c91b624f3b2a44e42aedaa5d8a24b511a0660ed87b7886c3e3ec2b3b6c3359c
Deleted: sha256:c3377976fa498c62e563f8f1482025108daddd954604c80efa6e86439ccce947
Deleted: sha256:bcc847d8e05e62a6dc4ba96a73d5b20dbb9365507c20de180e41e1b24cbe2570
Deleted: sha256:1391cf340ea914dcf483b7fda88e960d45daa0d7b1e9584cf17d071a6ebd3fc5
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:ad4db64c40b08cece35a7a8697628bc695c9d026d7912bbfe041eedd6e47940c
  Associated tags:
 - 20190624-180035
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-180035
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-180035].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:ad4db64c40b08cece35a7a8697628bc695c9d026d7912bbfe041eedd6e47940c].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3642

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3642/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-7307] Fix load tests failures due to mistakes in PR 8881

------------------------------------------
[...truncated 207.01 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -208: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-186\x12\x04-184'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_09_35_45-7765648518252570930?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_09_42_43-4367307667223011036?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 783.693s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-162757
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:62117ae0a6ac99a9d28a6a71a5eca95e1e5ab3729c8fd224870c8ec5c4ecfec4
Deleted: sha256:848c621cf685c71a30942d573f56ed2b8268a919ba74bbf4d2a8646624e29544
Deleted: sha256:ed92e63d6bfbc25bbda63676080fd7480ca9ea445970debd8e82ab3206150203
Deleted: sha256:75262fec0070ba0e7d282c7e11403d88a5e4e1ed2f9abb36b76e84d63ce4ee0c
Deleted: sha256:a33c278962f84b119558df3502178495db6a66983654a5f60070967592998e49
Deleted: sha256:39eeeb18e02c300bbf8d4747935c2539ae4545d49a02a68efc3203a3339fdf03
Deleted: sha256:637e9486e5fe1fd2ab5576c6d779914120d09d96574558cfcfd0aef6d68db16b
Deleted: sha256:8dd4de2f6381548b2a8a9fd872830430b0b832c149e1fc0cef940cf35482b912
Deleted: sha256:396ac6813dcf1d634db738cfc9e0f5ae63df1b3d66815df3f621ef6c16b3be11
Deleted: sha256:9d57c3811a173b7f8c67890de8dc910708629cdad2001ebd0c8b0a2407c2f323
Deleted: sha256:fdc688e9780963f201ecb59c302712cc360ee47b4cf07f8bb498e6c7ae040404
Deleted: sha256:1af1e6da866aa59db945e89569bd03a831319f10ceefe24f8ffd1d6754e5ceb5
Deleted: sha256:2e0806411cbc51fe3e3a454576b82bfa60774915431179b1bf0a5c43035a54a8
Deleted: sha256:19616331bc4e4a04077069f7c7547e28a07cb0e5c9e5804ef27dc816f3a81bd0
Deleted: sha256:87aa4f8413b7ed4d2205b5fb402e25bd0e0e9b6be9d66564c2e164893b9a33fa
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:62117ae0a6ac99a9d28a6a71a5eca95e1e5ab3729c8fd224870c8ec5c4ecfec4
  Associated tags:
  - 20190624-162757
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-162757
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-162757].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:62117ae0a6ac99a9d28a6a71a5eca95e1e5ab3729c8fd224870c8ec5c4ecfec4].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3641

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3641/display/redirect>

------------------------------------------
[...truncated 206.81 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_05_08_10-14978396553309689548?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_05_14_58-11056205319412606211?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 739.668s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-120010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:27125962c94d787741b0462b4336de89363f5f62f5262f7b6d1f3e24bdd7340e
Deleted: sha256:203b0259dc306255ac1a0358fc742ee03d7ed6964d433b613c447b5a7e44f840
Deleted: sha256:c2e916d9bec0c5bf08cbe9a76bf26748700fac7cbc48f9d879897307ae9b96f1
Deleted: sha256:707d3064f50438e5aa0c5fcfd32977524596c3962bb178bc70bebc05f353beec
Deleted: sha256:47ce397fa3d4d4f9cf216b604b68896255bf04fbdd6bb9e60f54b1bf15bbae60
Deleted: sha256:0dab0b1d5722790b7c21777f54b805b4e955988e47f473e070c000507eebd25b
Deleted: sha256:033d276300e10dd29bfce406aa91eda12217d85be8b7f8442ffbe63ca6bca542
Deleted: sha256:aaa4e35f590cd8417684a6f0a436e5cd65f18711da47bf42c1f9744970c6b3a4
Deleted: sha256:12918bc0ba32e6c78488533af891f7f69dbccaeaa014c9e8b1bf13405b6006b5
Deleted: sha256:4265eb65eb75ffbd4c7f5e72712fa7db10d69fc8a99b4b7842ffda2126592721
Deleted: sha256:eb7cc37d31dd4278e96e02f826c6387c6cfceff1131c85e99b0a5544688c9a60
Deleted: sha256:dfc45ef1636adcd10034e8fb8049743e7dc7f84cf52651ad755be23b75b56dee
Deleted: sha256:c397c685c54924edfe1d2365e816d66ed3a4c483a73c2ed09adc4ab42048117b
Deleted: sha256:771a06cd9750db6afb1f638b2be24bcbbfb991007d59ffea89caf765cd4a7f80
Deleted: sha256:022f89d82ea071eb70e35ecac4d573cf6c68fc687288204ee58e680b9cf331ff
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:27125962c94d787741b0462b4336de89363f5f62f5262f7b6d1f3e24bdd7340e
  Associated tags:
  - 20190624-120010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-120010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-120010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:27125962c94d787741b0462b4336de89363f5f62f5262f7b6d1f3e24bdd7340e].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3640

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3640/display/redirect?page=changes>

Changes:

[lukasz.gajowy] Revert "Merge pull request #8561: [BEAM-6627] Add item and byte counters

[lukasz.gajowy] Revert "Merge pull request #8400: [BEAM-6627] Added byte and item

------------------------------------------
[...truncated 207.05 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -365: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-343\x12\x04-341'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -365: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-343\x12\x04-341'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_01_48_22-18074846279609473447?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-24_01_54_38-16859407556860219104?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 736.384s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-084117
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a929c105ab1b0542df66cc21ed504777cd0540f918cabf0cb0be97760891de92
Deleted: sha256:e8f38fb47b7d6f203214c4864fb1e3dc0191386ff9f9f1ce74447b077d301660
Deleted: sha256:5da456b8a860c11371166befd75b6bc75e10c57005c69b734b0164780a4c6d4d
Deleted: sha256:a67c487d7541eaaeef23932ba63c59ac19bb11f7a8ddae38354fb3706a0528db
Deleted: sha256:cbe8bb21cceb05b260dc0e2a99e59a54512345b1c6a4258c8017217b960466f9
Deleted: sha256:60d43e71ecdfd46737cf29994e47fb136dba69a7e6dbdd52bfe86a5439ba559f
Deleted: sha256:beaa40954e5c3d46e0487d58bf65fd426f9b47d2298e65b2dd0708c372b74a48
Deleted: sha256:98d2f3205d52afbdd605335c2689d4358182b91e6dd5758d111890bbe0eff222
Deleted: sha256:794f4bf2841a7b82529c6fa617e8e086459c11c7367e63b15ee63f745dfcd9c8
Deleted: sha256:2963db9ce05a369c121973ac70542982fc22b5a14a0254e8558ff83043163da4
Deleted: sha256:103169381e25218804447ff560d3793152f7048747f4945eda5415c21c8fc13b
Deleted: sha256:37966c348130305cf824478fd6ae7d22239861e83d8e1502c7f5c0081c7cd164
Deleted: sha256:485861e3f7c7c2077b5d4af18abbbe30cd67a6ea3c384fb75eb872af3273cc55
Deleted: sha256:b7d911c4e3b28cc43125c7e29cd4004d973081404ac9ca28bfc02d79932e3f73
Deleted: sha256:bf8e8bcd129def20a8eb61a7348002c17a45c864919e11788aa39a5ce51b7fe6
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a929c105ab1b0542df66cc21ed504777cd0540f918cabf0cb0be97760891de92
  Associated tags:
  - 20190624-084117
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-084117
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-084117].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a929c105ab1b0542df66cc21ed504777cd0540f918cabf0cb0be97760891de92].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3639

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3639/display/redirect>

------------------------------------------
[...truncated 207.75 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -413: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-355\x12\x04-353'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_23_08_22-6416523750659395950?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_23_14_20-10380673369403166390?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 918.433s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-060009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:bd2277b9e1e622154be94dbfda1de0d66200ee28cebabe686d7ef846a11ea6e8
Deleted: sha256:0cd6e859c21b41c1353bf983bb69f7caf9299bbe931a183907a980dde92ba943
Deleted: sha256:432f5e36b01556078857763d77cb84f96ff7726ee5e9e464e890a3d990d83ee9
Deleted: sha256:60eef9d0e7682e6ff8cd13bc2848752279176e7cb413d00c16eb531f74b291ca
Deleted: sha256:489e49e7f487d822deadc352cea061bdd41794e3eebe8de1829905b72e218d03
Deleted: sha256:266d731febe8d3dcc707a9e5f35983f09325a18824fef546c89810213bfd5c9c
Deleted: sha256:4018e4455fe6f1eb7b9f002b446f79a555fd7a1fb43047fa2d2cc9a7dca4f77c
Deleted: sha256:7333004d6ba5827412950219b55c8c8fe3508a3ac2e1bd1798a54b5bd02d16f3
Deleted: sha256:b4b310b0fcb9e14745656c14d4186beabea21f4e143e9bf33cf00918073dd074
Deleted: sha256:105bad6e5bb5727bc6da2757f2cecfdc6b652d23b273478818bcf77539b240c0
Deleted: sha256:af6f87dd6b761c7dc226a5a699ab670767d4bafb6062ce2049d64f8f950cb11a
Deleted: sha256:989501e3e5fab557e177d01388f1efb80d3b42d6409b25ca606b955fe07203be
Deleted: sha256:42a178e4f2940a67cff663212b2cf849aef371e863756b4b08a8731429f52f55
Deleted: sha256:1d4e9322cd4ce9ee040b692bb8b46be3fa47f509ce02264bef2b670be5550d51
Deleted: sha256:2ec3eab38d9b3df722465c7404b64e162b6a7b0407135b2d680f16811524e745
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:bd2277b9e1e622154be94dbfda1de0d66200ee28cebabe686d7ef846a11ea6e8
  Associated tags:
 - 20190624-060009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-060009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-060009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:bd2277b9e1e622154be94dbfda1de0d66200ee28cebabe686d7ef846a11ea6e8].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3638

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3638/display/redirect>

------------------------------------------
[...truncated 207.12 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_17_08_30-9143824873231229155?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_17_14_38-6196038347460217772?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 895.011s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190624-000015
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:4dd710d8d4897fd979d2e19b44e268e5e4cb126d24627314cb1754f27a88c38c
Deleted: sha256:03b2454e261c3300a42bf9bba79b3519725e8f648987b194c3289ddccebb84e6
Deleted: sha256:81b31f8b43338daefc9bbe0804a9a215752ced6c046fc373a982a4f2cc16490f
Deleted: sha256:143b53d8d7f7003a67a3f80c289e4b0b64517851b664b892f2e42e499bd2ee5d
Deleted: sha256:bd1b10a9ef28e23415ae44f8d99cac0a586c093c198032f6251dc391dab58177
Deleted: sha256:ff058f2de927cba9e4ede94d3a861696df4ad65b1be2fb14d620762de57fa645
Deleted: sha256:df7b17389387fcc76b98ef0cdce5c898c0d2483a18f4740aacb95cea858b76c2
Deleted: sha256:1e81def770495f47caf945d46f17d16a363c6e64bd00b9cd3f7ad53cbad4ad8a
Deleted: sha256:f7b5dc2d25404fd00c63aac7bd98bdd9cec774404ec477424bfa05c73ac574ce
Deleted: sha256:3a282e4451ccf557d73c05a8687399e0d26f80ec70503019f35b030bcb99b262
Deleted: sha256:58f3397adc8bccd3b3450656ca2bc261171929449196d3b5c8fde4991c73ceee
Deleted: sha256:7a822d7be9ef7e287371291f9c1386208690685d20800af691466b478f555267
Deleted: sha256:b4177585fdb46438c86e54ef50a1d3a89724697605be85b3c5fe6e9def8dc2f8
Deleted: sha256:9bf853f4589d566d739dbae094a09cf50839394669f9f1cdf2576f82b9c7b198
Deleted: sha256:3f766b2fdbc8874484a74c7479b3ec33101557a0e38ad66e343dea6b6badf69b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:4dd710d8d4897fd979d2e19b44e268e5e4cb126d24627314cb1754f27a88c38c
  Associated tags:
 - 20190624-000015
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190624-000015
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190624-000015].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:4dd710d8d4897fd979d2e19b44e268e5e4cb126d24627314cb1754f27a88c38c].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3637

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3637/display/redirect>

------------------------------------------
[...truncated 207.45 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -209: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-187\x12\x04-185'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -209: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-187\x12\x04-185'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_11_08_30-17669516355737238729?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_11_15_29-3498389591110615362?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 915.695s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190623-180009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3994c9010fb74bc4429b86b41624439d5f19cd161a95927ee4352631ec62764
Deleted: sha256:81ecf17e4637020f6f7f761ed16641478f61eed6bf62d1301114661112dc53d1
Deleted: sha256:fcb0fe9e42c4658475ccd27ad97679ee143240ed2578782fa806980f26ae9017
Deleted: sha256:a1c0eb859ea3eac08b5615c4ba52f0ba42957f2dd283de5d6f21900d5e6af9d4
Deleted: sha256:a50dd658af67dc5d5139f55318def2bfd584d3a50f0b32c6e8e6c99ec0c23239
Deleted: sha256:3a02da46c1c300d0da4fe9ef0f36d04cfd6798aed94a6158a167b98a036f7a57
Deleted: sha256:16b2bae1931cae7d6d888a654c3e9045164d460894755f832b39ba773869fabd
Deleted: sha256:4628b5d4771e2584b75ebc08235cc00566bbe54e69e75bf3fd234538cbb65860
Deleted: sha256:82659b78fb322a190040129ad01713cdda3c92df3fa42984734a59a13d93ccd9
Deleted: sha256:a93f8a9fb70c2d0e53db65384b34d72de3b73b33a8c558c6032ed4c86216f113
Deleted: sha256:aea2bd07bbb9a29fa9e6278e44d2efba956d899da23ef8498a564bee3cbb204b
Deleted: sha256:1c8346e6d24043f56b7e9ce7f2cb0ffa9d27ed8bfb0daa21aa6a5cf84f105d63
Deleted: sha256:988be4f657e33b463924b96709c951e9a05362d2b9ca5ae58fff3ebbdc3c7856
Deleted: sha256:45f4357dda5e5eaba4b972617e328f678b9d0bb701a6fdda00da41af5e8b0cac
Deleted: sha256:33f53ee087ac575ac57a0162f5dcb67b619d6247c9ef8bfe5dce0e41093ed6cb
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3994c9010fb74bc4429b86b41624439d5f19cd161a95927ee4352631ec62764
  Associated tags:
 - 20190623-180009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190623-180009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190623-180009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3994c9010fb74bc4429b86b41624439d5f19cd161a95927ee4352631ec62764].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3636

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3636/display/redirect?page=changes>

Changes:

[ttanay100] Add to docstring of advance_watermark_to_infinity

------------------------------------------
[...truncated 207.93 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -350: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-292\x12\x04-290'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -350: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-292\x12\x04-290'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_07_05_34-1629505697491075308?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
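A note on the KeyError above: the unprintable key u'\n\x04-292\x12\x04-290' looks like a
protobuf wire-format string carrying two short ids, which is consistent with the runner
handing the SDK harness data for a ptransform target the harness never registered (for
example when the runner and the worker_harness_container_image disagree about the graph).
A minimal, standalone Python sketch that walks bytes with that assumed layout of two
length-delimited fields -- illustrative only, not Beam SDK code:

    def decode_fields(raw):
        # Walk tag/length/value triples; assumes single-byte lengths and
        # length-delimited (wire type 2) string fields only.
        data = bytearray(raw.encode('latin-1'))
        fields, i = {}, 0
        while i < len(data):
            field_number, wire_type = data[i] >> 3, data[i] & 0x7
            assert wire_type == 2, 'expected a length-delimited field'
            length = data[i + 1]
            fields[field_number] = bytes(data[i + 2:i + 2 + length]).decode('ascii')
            i += 2 + length
        return fields

    print(decode_fields(u'\n\x04-292\x12\x04-290'))  # -> {1: '-292', 2: '-290'}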

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_07_12_17-14537997649736684417?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 764.606s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190623-135711
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b8dcd87dbb2f875a076b07eee264df84767417ea973bc2aeb6195df610e496ac
Deleted: sha256:30cee3f5a842e22ca48e5ac565b044669684c4065a76dd40de7c73098e88f286
Deleted: sha256:c8562738472adcca3514bb6128dda6d01958a794a2d39cf6fe0f762e52ec5881
Deleted: sha256:027a06b750bab779b75ec5b6a00083cc80f2535e933fbaa18f22606c47cc94f1
Deleted: sha256:166ec37a933c940cd6d8ea7b457f2dc3ed2dcb6076c466f68b660701e1a21117
Deleted: sha256:818c92725fe64c0ce895de41f150b0fab043cd1d6c45fdebb5ef26005376d184
Deleted: sha256:69e61a25a193d96954ec2e176a2f00e6e1b7a1ecbb5982a6f09d9c78e29dd63e
Deleted: sha256:7519d71722de5c5e58f0dfb18278f3716d3b691b99da21aa978af4f32e7780dd
Deleted: sha256:4a2519021763cd50ceb2f4ecc63dd56cb67b156f190a2333d597ab0fdf87a82c
Deleted: sha256:92796ff1c3503bbded1467a90143fd596ba4ae5ead9093ce5dea3d2d5ad6d2d6
Deleted: sha256:b885bf2a00ebde536de7e294cf23f76fd1f5ec0d47ae41018fa78f3eb8607f4b
Deleted: sha256:a560d8eaa530c5039d1c16520cb1005b5fc015a1ca8f9fdf80399e084fd17ce4
Deleted: sha256:d174baa999d46530965df98a7226e4ec8f92323046572a197b019e8a1723ec73
Deleted: sha256:077d921d859700344ce74209f984da5328a84f829817ea864f98fb428a23b430
Deleted: sha256:6b7d3729fd48d93d4eeb10780f1132191d0da2c0bc5126994fa2fbb74a7afbbf
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b8dcd87dbb2f875a076b07eee264df84767417ea973bc2aeb6195df610e496ac
  Associated tags:
 - 20190623-135711
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190623-135711
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190623-135711].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b8dcd87dbb2f875a076b07eee264df84767417ea973bc2aeb6195df610e496ac].
Removed the container
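The Untagged:/Deleted: lines above are standard docker image removal output, and the
Digests:/Deleted [...] lines match deleting the image from the container registry. The
cleanup script itself is not shown in this log, so the following is only a hedged sketch
of equivalent calls, reusing the image name printed above:

    import subprocess

    image = 'us.gcr.io/apache-beam-testing/jenkins/python:20190623-135711'  # from the log above
    # The Untagged:/Deleted: sha256 lines correspond to removing the local image.
    subprocess.check_call(['docker', 'rmi', image])
    # The Digests:/Tags:/Deleted [...] lines correspond to deleting it from the registry.
    subprocess.check_call(['gcloud', '--quiet', 'container', 'images', 'delete',
                           image, '--force-delete-tags'])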
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3635

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3635/display/redirect>

------------------------------------------
[...truncated 206.99 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -367: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-345\x12\x04-343'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_05_08_21-7165345381360091103?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
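For reference, the _run_wordcount_it frame in the traceback above builds the pipeline
arguments out of the --test-pipeline-options string via TestPipeline. A rough,
self-contained sketch of that usage (the output value is a placeholder; the options
needed to actually reach Dataflow are assumed to come from --test-pipeline-options):

    from apache_beam.examples import wordcount
    from apache_beam.testing.test_pipeline import TestPipeline

    # TestPipeline picks up the --test-pipeline-options string passed to nosetests;
    # get_full_options_as_args merges it with extra_opts into a flat argv list.
    test_pipeline = TestPipeline(is_integration_test=True)
    extra_opts = {'output': 'gs://temp-storage-for-end-to-end-tests/output'}
    wordcount.run(test_pipeline.get_full_options_as_args(**extra_opts))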

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -128: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-106\x12\x04-104'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -128: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-106\x12\x04-104'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-23_05_14_09-2250276220925929497?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
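The wait_until_finish frame in the traceback above is where the failure surfaces as a
DataflowRuntimeException. A caller that wants to inspect the failed job instead of
letting the exception propagate could do roughly the following; this is a sketch that
assumes an already-constructed pipeline object p, not the test's actual code:

    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    result = p.run()
    try:
        # duration is given in milliseconds; the runner raises once the service
        # reports a terminal state other than DONE.
        result.wait_until_finish(duration=900 * 1000)
    except DataflowRuntimeException as exc:
        # exc.result is the PipelineResult; its state and last_error_msg are what
        # end up in the "Dataflow pipeline failed" message above.
        print('Job finished in state %s' % exc.result.state)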

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 719.457s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190623-120014
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:405e73dda29d0b0fba8772ecb5249ba77bd9abe9eaa20465ce8d68d4e7df419f
Deleted: sha256:046e5dd82de4d546782892dc7808c8655fa551c8f978bd672cb62a0ad4376337
Deleted: sha256:e28e97037d1922faf3cee6174edbc2782b6a0ddc6faba7249e7781c951ce9540
Deleted: sha256:25f380befb64a252d5a25dbb427b3bada6fffde654ec80a5d05c8ea84c98a5eb
Deleted: sha256:d8d62037fb588f6521c409038fca3de655d8e5f07777b3b9452eca384cffd86d
Deleted: sha256:2ff64ff0d1d199d45278b7d6759f65dc83d4117e99487150f293bea7545a17f8
Deleted: sha256:d730c0c1e74a5a7b78f00e39015c7867cccd29edfa4a57c52a7a04305e8247ba
Deleted: sha256:84dea3a64326865984fabe78cf53197246442271fcc81bc2c619ca57fddc0f83
Deleted: sha256:fb3fa55101a5c353b006360016f0621fe0b3fcc18ccae7bc7615328d8552ce0c
Deleted: sha256:d6c225c70d8c8902ae18c6d4e49e1ad9bd4795b75b80c49c88b95b4427e12ef3
Deleted: sha256:3e09ae5a48c401e9837916b70c58cd5a9ec7f432b3fa83c97ee4534f637e82a9
Deleted: sha256:9110a78edd94941ea4a6bccb15d8adf0b010d944c600eb7a791f760edc6b67ce
Deleted: sha256:705fa5874f0c86c190f00cfa9ec2c066e607ec3eb8d7be449af519baff1f7742
Deleted: sha256:53b4ea5828b335243e70dec1f7d49cc8690b83f103669b9046f7e33d5297666c
Deleted: sha256:50091d79b6eb305e369c07fa677ac0d095bd0d38f3b1a9232dbf0f36bfcd3b1b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:405e73dda29d0b0fba8772ecb5249ba77bd9abe9eaa20465ce8d68d4e7df419f
  Associated tags:
 - 20190623-120014
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190623-120014
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190623-120014].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:405e73dda29d0b0fba8772ecb5249ba77bd9abe9eaa20465ce8d68d4e7df419f].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3634

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3634/display/redirect>

------------------------------------------
[...truncated 207.18 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_23_08_14-3343341276925936897?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
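The Found: line above carries the console URL for the failed job. The same job can also
be inspected from a script; a hypothetical example using the job id, region and project
taken from that URL (a generic gcloud call, not part of the test harness):

    import subprocess

    # Job id, region and project are taken from the Found: URL above.
    subprocess.check_call([
        'gcloud', 'dataflow', 'jobs', 'describe',
        '2019-06-22_23_08_14-3343341276925936897',
        '--region=us-central1', '--project=apache-beam-testing'])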

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_23_14_18-13091992611010855196?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 724.567s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190623-060009
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b90b7a0537844cde1d7eb1275e82156d256cddf35e5649edb4af12ad3d031840
Deleted: sha256:678708b05fed67cad76a172a3d27b0d7471ce73d394bb040fd202c2bd7b4c588
Deleted: sha256:90df98bb46be13879186d9a08d5397a2a37e762333da876064a9204caea7b194
Deleted: sha256:cc2d0ced7992625a6b7871e1d85f473ee10eab7342db068eb6cf34ce68180851
Deleted: sha256:317e55afc0454bd0542c6c8656689071c45f87c8a5c2b1190484611c97d72835
Deleted: sha256:4bf7fc4aa9bfd0b7abb562740d799190442f1443f380bf28a1bbb221a2133322
Deleted: sha256:9e66470acf116b955cedd7629db5aab6cf5bd2fa86f1b639b69d014d153a1bb6
Deleted: sha256:69f93c8d16529fad96f9cd3462843dc4f1e7831ddd70b770a6b1ef9766af7bba
Deleted: sha256:fbd862e73d3c5302c0b5ea1c4e38a03b9885fa844c9ce5ad043d19c0a845ed80
Deleted: sha256:1e72615ee419fbb724f9c64a3184a4eebfd6fb30ed11b83b03bb3a7ccca11e64
Deleted: sha256:013b2edce2025024d6147f6bd6bed1094b6496436f44656d038f203b65900769
Deleted: sha256:0b7cf09d43d5f8894ab194ad9602e6252927023834e1ebf41e07aae9d5a8c0b8
Deleted: sha256:0e312475ad9c2ffd119854afcd233c6474e0f310eff6bc0c64aa95f077e16f5b
Deleted: sha256:26492c143200eccbae88381d7919f1add9c9eb294a6b32033c1f34613f1ecb46
Deleted: sha256:6cfde9305a6736b1dbec0dd06f9ed1688c60b3d17b1f2e0794a517baccc1dc08
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b90b7a0537844cde1d7eb1275e82156d256cddf35e5649edb4af12ad3d031840
  Associated tags:
 - 20190623-060009
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190623-060009
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190623-060009].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b90b7a0537844cde1d7eb1275e82156d256cddf35e5649edb4af12ad3d031840].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3633

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3633/display/redirect>

------------------------------------------
[...truncated 207.74 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_17_08_46-10563883138235192212?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
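Both errors in this report reduce to the same KeyError inside the Python SDK harness: process_bundle receives data tagged with a ptransform id that is not among the operations registered for the bundle (the u'\n\x04-290\x12\x04-288' key appears to be a short, length-prefixed serialized identifier). A self-contained illustration of that failure mode, not Beam's actual implementation:

# Hypothetical registration table: one input operation keyed by ptransform id.
registered_ops = {u'\n\x04-290\x12\x04-289': 'input-operation'}  # made-up key
incoming_id = u'\n\x04-290\x12\x04-288'  # key copied from the traceback above

try:
    # Mirrors ops[data.ptransform_id].process_encoded(data.data) in the trace.
    registered_ops[incoming_id]
except KeyError as err:
    print('no operation registered for %r' % (err.args[0],))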

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_17_16_20-10145413616641685506?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 834.908s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190623-000016
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:a11c49d644f68ed9c1c9695dd9ba2612df2faf7ad23d5a10727a16d48969ac40
Deleted: sha256:91602a7ddd3422f67afc64c1e6d6914af3d0c8535d9dec2f14f0188d5342dfa0
Deleted: sha256:c9784365dbfb38909d8bd9453c6f3c7d81fc5f29cee4191794cd7d7d5e121d0e
Deleted: sha256:11397d707c837735bbeb3e6b6dfb7a8425659e90b8418dd8ffccb5433b3d3d2b
Deleted: sha256:85a9d5a66ad66a1af67faf121051f4fbf45ce2d9158d147e23f35c8b5559800b
Deleted: sha256:2bdac2e1d1861bc48209e7674886a6c5a5066b172be206b6a8b97151bb64d4c4
Deleted: sha256:2e494c85f2481c7d7518f23f8417d8802e02f4177432b745490bbc0f236afe28
Deleted: sha256:e49872a4d72d68f0bb273a90b1b29bf3bd859bbc440e14992a9ce784492caacd
Deleted: sha256:5ad47da6ff81d0ab9e7bf02b986c86afcdcb4e4be26a02db11013c1587a55cbc
Deleted: sha256:b8997ff220be6f5c2eaf513615c2d91cb7fecaf6b62ac5769139b4a9b48d5db6
Deleted: sha256:df3a5f7c0191c72af03084ea2a28fce5aea84b40e31bfd9145ba35039345aa60
Deleted: sha256:0f8145f93f237f27bdc3c3cbbf0bee2e671c7859cf8bbc8903791e260454f74e
Deleted: sha256:6a8ab24597c46f7a8821672b8fd3b95dcc8d94879206cdbe2fbcbc71e0d01008
Deleted: sha256:f988ce98fe8f758f23c5a13457e5262b1da6ff21859ec5e1ef83d2e8a14ddea1
Deleted: sha256:f120453a4165423e6e1fbcbe4459bfcb27d349aa1b98314c739c7bdd0569381e
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:a11c49d644f68ed9c1c9695dd9ba2612df2faf7ad23d5a10727a16d48969ac40
  Associated tags:
 - 20190623-000016
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190623-000016
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190623-000016].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:a11c49d644f68ed9c1c9695dd9ba2612df2faf7ad23d5a10727a16d48969ac40].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3632

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3632/display/redirect>

------------------------------------------
[...truncated 207.10 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_11_08_41-3112595896940245019?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_11_15_10-11851044951095767320?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 754.999s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190622-180011
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:1a68b098522eae271eb0b3940ad5adf9b3a18c1eb694d3a3f74741687fe616da
Deleted: sha256:8345ecbc32a99ce51288392f743bfcc6c2e08c1227e6d186303e942da0fc33fd
Deleted: sha256:79b581bf295a6ed8495544b35e6e2d0a14acdfe21c520ece3f1824dce0def38a
Deleted: sha256:4f6f24a4d4f756971ef438149820028c4e6ba23d8b75db9ed726a516e3f350f0
Deleted: sha256:b6ee228e56b56548ac0245e297464d889638344666f30055ea6b95fb7b412d78
Deleted: sha256:877dd0baafa9414de8b21f8d7d0b5cbec6f37216611fb7c4b56bcef08a25f50d
Deleted: sha256:2e4af055ec2a12440adf25233fd117b515795ff4280ff9a9b0614558f41fddaf
Deleted: sha256:4ed8d1cea112bbb4d7220e31f488a9181e397c4dcabd919db5e54f396f4474bc
Deleted: sha256:dc751c6b7ea83635a8c7a668fd8200469c3a3adb400742779dcf0be3ebd6c8ca
Deleted: sha256:b77f2a35203880b2dfb8ed7d998697f9f4d5de8cda28761a10b144b3d29b20fb
Deleted: sha256:2063a5689c0e27620a4a3ae58388055c3d7eabbfee88e5943846dec99abd5029
Deleted: sha256:82c5288e14a0259e63b67c04f4fecbd4a5596a0b7f909f2873d5966506e09656
Deleted: sha256:238e95d75c5503b885aebff06ad946a58aa5de3ab1577f434ee9af4334f673da
Deleted: sha256:1ca6237cbe8e4bb45b1c518a3ad9a43b4a8145c5e0a016944d14763a85e535e3
Deleted: sha256:7603950388db2397175fef18a07ceb600412be4b7e4923016f91fe02bf250bd6
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:1a68b098522eae271eb0b3940ad5adf9b3a18c1eb694d3a3f74741687fe616da
  Associated tags:
 - 20190622-180011
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190622-180011
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190622-180011].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:1a68b098522eae271eb0b3940ad5adf9b3a18c1eb694d3a3f74741687fe616da].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3631

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3631/display/redirect?page=changes>

Changes:

[hsuryawirawan] Add README files on how to setup the project for both Java and Python

------------------------------------------
[...truncated 207.09 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -380: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-322\x12\x04-320'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_06_37_05-6068007472274069503?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_06_43_04-909530213045353579?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 775.267s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190622-132903
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:47b675d1942494b2beb7b070fdf775c40f023286c0c30657d81a1a7fc7274afb
Deleted: sha256:1714f789a83f9978c14e1a765c27536e232e7c10546d3269504a69a31b715d4b
Deleted: sha256:0612536113c0f88793fc1b55f540b7526138119189d4c9800d2f5ffc03f88f9e
Deleted: sha256:f3aca5f609f518062a020689806eab7247f5660f6e85be9b85a83135ddf60a02
Deleted: sha256:81de54fbfb76d91565aeb25bbc0ecfca35c598ae75612fa2063f6bca0cabb56e
Deleted: sha256:b8b41d01750ab9edeee66b73b56d3e263eaa8cc99814bbab8b984165da17829d
Deleted: sha256:004583804d8d33f4df566a6a84b8e9a4a3dab869653076f8ef12ed6d1a5091ee
Deleted: sha256:a969b629aa170f8ab495e33ef66221485b6c5b1ac560e4229465f3a0df912156
Deleted: sha256:a3a0bd9d2a302e3afcb020b4944f5826376722f41693f201a801c2a76461110c
Deleted: sha256:c5a5c34c0814f69040fda9bf0c1f1177388f1e9d9a8cc26c0771d36c8a8896ee
Deleted: sha256:354edb54e3d07590406f14332430e4a65a108c0f534e8c71c8336225db12a3dd
Deleted: sha256:6396e63f981f8b0387cce985811627a45b65ec5ef252dc5cb12e21433f4c6078
Deleted: sha256:1c4cb2ec0ca84dfb456d242f3260e0bba016778849d9bad37c5e5f32fb44bb64
Deleted: sha256:19c7a6fc11932b165246c1f16744b0f993cc12dedff165c716e1e75053cc966d
Deleted: sha256:373aec28cf0f50c6acbfeb00dee517d6c92e2cf0ab6d1408121699a05c8a9a2d
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:47b675d1942494b2beb7b070fdf775c40f023286c0c30657d81a1a7fc7274afb
  Associated tags:
 - 20190622-132903
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190622-132903
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190622-132903].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:47b675d1942494b2beb7b070fdf775c40f023286c0c30657d81a1a7fc7274afb].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3630

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3630/display/redirect>

------------------------------------------
[...truncated 206.91 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
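
The deprecated-Datastore warning above points at apache_beam.io.gcp.datastore.v1new.datastoreio as the replacement. A minimal sketch of what that migration looks like follows; the class and module names are from the v1new package, while the kind/project values and pipeline wiring are illustrative assumptions only.

import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

# Build (but do not run) a pipeline that reads with the new client; actually
# running it would need real Datastore credentials and a real project.
p = beam.Pipeline()
entities = (
    p
    | 'ReadEntities' >> ReadFromDatastore(
        Query(kind='MyKind', project='my-gcp-project')))  # hypothetical values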

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -348: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-290\x12\x04-288'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_05_08_50-13431212113384488463?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
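
The KeyError key u'\n\x04-290\x12\x04-288' in the traceback above is easier to read once it is split into its length-delimited fields. The sketch below decodes it by hand; treating the key as protobuf wire-format bytes naming two short ids is an assumption drawn from the byte pattern, not something the log states.

# Hand-decode the KeyError key from the SDK-harness traceback above. This treats
# the key as protobuf wire format consisting only of short length-delimited
# fields (single-byte lengths), which matches the byte pattern but is an
# assumption rather than something the log states.
key = bytearray(b'\n\x04-290\x12\x04-288')

fields = []
i = 0
while i < len(key):
  tag = key[i]                         # 0x0A -> field 1, 0x12 -> field 2
  field_number, wire_type = tag >> 3, tag & 0x07
  assert wire_type == 2                # 2 == length-delimited
  length = key[i + 1]                  # single-byte length is enough here
  fields.append((field_number, bytes(key[i + 2:i + 2 + length])))
  i += 2 + length

print(fields)  # [(1, b'-290'), (2, b'-288')]: the two ids the worker failed to look up
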

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-22_05_15_43-8945750987961992908?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 915.020s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190622-120010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:2c28223e29850580559a236ac0097db8876bdab1fbed3148b6813ec9e0f72907
Deleted: sha256:859abf9d5bbdf3b232128ce1a2329ce33ce93d04838217344247cf2e6a52bd32
Deleted: sha256:fa6eb8d52146d0af2bcc3a5a5ff4147167036fee68b95b90e6698466fcc3ffa5
Deleted: sha256:060a97d03e443e97cc4a2fc29f7d3e288c331de456a73384216b6dcac384df9f
Deleted: sha256:a1a1570b18889aedb418d49b78398cd3f5e02aa2394d1791b9d5a9fa580ea6a8
Deleted: sha256:f47afc647f4667f67a3338fa9b5c8f89af78f86160132417a74843ba9b4b5423
Deleted: sha256:2bb08e66a2af7b87b130861c4b3b16b3d0bd304310497eb64ce9c20a23823faf
Deleted: sha256:03354948375d000e22ddf8a937cfcb9a928b047d94f451506df3e85db709561b
Deleted: sha256:2d23cc7d58159c7588ee6be5ea9b2fea65f7ec823fa238579becf6c7c67f14a0
Deleted: sha256:c92bed31c0cd48ab80db6c81dccf215781d7548c5ebeb50ad8e13563bd093c97
Deleted: sha256:92e7dff9cbbd7e3eca4dbb3521ce3f2552c76aabcd97945ac1e7e8ce3ac93129
Deleted: sha256:484bd605c9aea4d556de55732d79c5910e345f7f46d05168d6c4dd7b86a2d77f
Deleted: sha256:922f91335fce04baa4fbd21ade42a2700578f4a69ce9fa1acf2d6ba7bf5300ca
Deleted: sha256:1fd5ac59ad3a1f1124aa1c02a28d260d2d9a42e7f13d510c6e9eb03c530e971f
Deleted: sha256:2ee0e3e182ada6046f9ffbffe862578070ab6c68b053714b83e84bd4b9433142
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:2c28223e29850580559a236ac0097db8876bdab1fbed3148b6813ec9e0f72907
  Associated tags:
 - 20190622-120010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190622-120010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190622-120010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:2c28223e29850580559a236ac0097db8876bdab1fbed3148b6813ec9e0f72907].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3629

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3629/display/redirect>

------------------------------------------
[...truncated 207.04 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -381: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-323\x12\x04-321'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_08_49-5776542734772179856?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_23_14_53-4204662492400470476?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 759.699s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190622-060012
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3610f7f0f4a914a74b6f2b82c4ea2295571fce09009b3bc0c8153e973906544
Deleted: sha256:950de869541596998b444e3952da2867959a91c83a1cf92e218cdd509b316009
Deleted: sha256:8f88ba492845c960ec72bee90821942c43d5999d407b9bd7151e2e95ab20939f
Deleted: sha256:71333a669b2d0275a43a0ea9bfdfdb1072bd730564d9fba79602b5c53918386e
Deleted: sha256:f675693f35e6229c32c4f301e31fae3ee10d4f215299f1ae57fb6247ea43f073
Deleted: sha256:89cf9842493c09b3387e37c8a02280a9ed9e00acf5b6ab5a2d0ea04c6fe278be
Deleted: sha256:61e1351a880a4d92c7c14e3b3b3ed76c9d0d064f68aab2fad1d5212929c0a327
Deleted: sha256:9e3b7c16268134c3ae22135e9a327f4a4c8171553e6f0ee9d731b5bf7cb445fb
Deleted: sha256:edea39995aced776c7ccd765e6d4818c8ee74e130c8abf2a885d48d964d63c03
Deleted: sha256:9d885f5ee40b851d77fb5fd54a51bd174d6bb716225b1d82cd19f029fb521975
Deleted: sha256:5017de07d5d53ede7e2f956bd4f2a2237b155085d082ca01519c97afe2c56254
Deleted: sha256:e7200247885d2f76799a4b0ca98534a35278bb262a12148876b5712b84c5f7d1
Deleted: sha256:f1866defe918618b2d5c4ea43c4aeadad6f6f4c46f08ad09c5d319207fd2878f
Deleted: sha256:67b3345ac7fbc67ab97cbf2e4f91725be852a6e4c8758faf698ca0f1928f0870
Deleted: sha256:b1b2d595cccabebfae8a86fca7ffa257ba777dc0a655e41eeb64582b15c0a718
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3610f7f0f4a914a74b6f2b82c4ea2295571fce09009b3bc0c8153e973906544
  Associated tags:
 - 20190622-060012
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190622-060012
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190622-060012].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d3610f7f0f4a914a74b6f2b82c4ea2295571fce09009b3bc0c8153e973906544].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3628

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3628/display/redirect>

------------------------------------------
[...truncated 207.65 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -288: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-266\x12\x04-264'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_10_42-9966468686201303327?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_17_17_27-15541714320221730797?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 819.167s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190622-000030
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:c06422dcac2443fcd4f4ea6057c24f918254ae0687541d342b9ff0cadc208ef7
Deleted: sha256:d016f3576ca271267c098107a8751625bec0d186efdf4847a2fecc96edaa35f2
Deleted: sha256:efa5f25422abf43c9534aaecae2d5487d773af49e3f097ca819491b0bef2a170
Deleted: sha256:dc1a9f1f6fd3805aed7e587255e95926fcda68f7dd1d83ffce81a70dbaec5951
Deleted: sha256:7d196e3a4fd1fa1953b2cb85ab0bdd2a384c698be973f024e93a43b2db4aa47a
Deleted: sha256:7ff32b7e95257e30611eb4d345767190419ba872dfa330b1716b590678aebdb9
Deleted: sha256:bfc7e2e1f793910703e3e2d68e4029c291cf43eb66222472917619cb127b1df5
Deleted: sha256:48f083b8b99a36f5467dd9b9f1d54d7d613116ad1431e20ab44b0d947aa93859
Deleted: sha256:66f9388cbffbff588f465b3a7caa8c15e333aa85498a79b62b885210791c3e3e
Deleted: sha256:29202ff01ab167c10929a48efd73e90129976545d55eb43633f133d86f46b5e0
Deleted: sha256:bfbd1e6ed21ece23f62d05f4a126d553f948b3fd6f492b7c93a5b4a40e8d2827
Deleted: sha256:7dbda469a107b0887f1a295092e6955ceddd4427dea259aa9b01df396b036316
Deleted: sha256:c8ea0f664c789c52e971fcad311bfd391631ae8a937677dcc918ba297d8c511b
Deleted: sha256:aeaa71e473a54e98aa8de9c88e77fa42312b98c9a6d0b00642e5f06d550c3cb4
Deleted: sha256:c80b194a70b55f4ad73337c08139c819e62fc5e954a21c1728cff284abc78bc1
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:c06422dcac2443fcd4f4ea6057c24f918254ae0687541d342b9ff0cadc208ef7
  Associated tags:
 - 20190622-000030
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190622-000030
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190622-000030].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:c06422dcac2443fcd4f4ea6057c24f918254ae0687541d342b9ff0cadc208ef7].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3627

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3627/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for FlatMap transform

------------------------------------------
[...truncated 204.48 KB...]
copying apache_beam/transforms/window.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/transforms/write_ptransform_test.py -> apache-beam-2.15.0.dev0/apache_beam/transforms
copying apache_beam/typehints/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> apache-beam-2.15.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
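
The deprecation warning above names the replacement module for the old Datastore client. A minimal sketch of the import change, assuming only that the v1new module exposes the same transform names; constructor arguments differ between the old and new clients, so this is illustrative rather than a drop-in migration:

    # Old, deprecated import that triggers the warning above:
    #   from apache_beam.io.gcp.datastore.v1.datastoreio import ReadFromDatastore
    # Replacement module named by the warning:
    from apache_beam.io.gcp.datastore.v1new.datastoreio import (
        ReadFromDatastore, WriteToDatastore)
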
Exception in thread Thread-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-21_16_32_17-16914571766244873814?alt=json>: response: <{'status': '404', 'content-length': '280', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Fri, 21 Jun 2019 23:36:58 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 404,
    "message": "(42b019e232244dc0): Information about job 2019-06-21_16_32_17-16914571766244873814 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... FAIL
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -131: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-109\x12\x04-107'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -131: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-109\x12\x04-107'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_39_17-9911781733584428467?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
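
The KeyError key in the traceback above (u'\n\x04-109\x12\x04-107') is not random garbage: the byte pattern matches a length-delimited protobuf encoding of two short string fields. A self-contained sketch that decodes it without any Beam or protobuf dependency; which id refers to which transform is an assumption, only the wire-format decoding is shown:

    def decode_two_string_fields(raw):
        # Parse consecutive length-delimited protobuf fields. Each field is
        # <tag byte><length byte><payload>, where tag = (field_number << 3) |
        # wire_type, and wire_type 2 means length-delimited.
        buf = bytearray(raw)
        fields = {}
        i = 0
        while i < len(buf):
            field_number, wire_type = buf[i] >> 3, buf[i] & 0x7
            assert wire_type == 2, 'only length-delimited fields expected here'
            length = buf[i + 1]  # these values fit in a one-byte varint
            fields[field_number] = bytes(buf[i + 2:i + 2 + length]).decode('ascii')
            i += 2 + length
        return fields

    print(decode_two_string_fields(b'\n\x04-109\x12\x04-107'))
    # {1: '-109', 2: '-107'} -- the same "-1xx" style ids that appear as
    # instruction/transform references elsewhere in this trace.
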

======================================================================
FAIL: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1331, in wait_until_finish
    'Job did not reach to a terminal state after waiting indefinitely.')
AssertionError: Job did not reach to a terminal state after waiting indefinitely.
-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_16_32_17-16914571766244873814?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
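
The assertion above fires because wait_until_finish() returned without the job ever reaching a terminal state. Whether a state counts as terminal is decided by PipelineState.is_terminal; a small sketch of that check, assuming the import path used by the runners in this SDK era:

    from apache_beam.runners.runner import PipelineState

    for state in (PipelineState.DONE, PipelineState.FAILED, PipelineState.RUNNING):
      print('%s terminal? %s' % (state, PipelineState.is_terminal(state)))
    # DONE and FAILED are terminal; a job that never leaves RUNNING (or never
    # reports a state at all) is what trips the assertion above.
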

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 786.749s

FAILED (errors=1, failures=1)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-232454
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:d6fb010eb64a2c288891d0a64b0fab6126483e5e4bcb14d0ce1c926d67807601
Deleted: sha256:4587b29b1b850112490c0ef7bf0f338fd06f0491af45760f408f4c55199c6a23
Deleted: sha256:6faacb07f336851745bd6a744ea2f3f22abede5be8f44241fb16b5a5a397d05d
Deleted: sha256:61b70242cd9692574d0d1135ee3dd2f07d5e94529566bc544b9a5817c97f12d0
Deleted: sha256:19a1e0160469ca9c5e62ff2fd2b58988a12e6035412bfd38c87f1bf128b530f2
Deleted: sha256:9d277f3b026a390eda3b983241cca15a8ccfbef35ba82f0658f9d87f015c2be6
Deleted: sha256:12417092206e31c08b1eac53d5fad460bd0777a9225a12f1af242b95f45328c9
Deleted: sha256:22aeee6be7e4475b44e94cd18573b232ba2daea0c870509e3d47059c9bc92677
Deleted: sha256:241c478635ecb236734a19dc1e062fe93ea4f6e9f9f3774f6f99fc6d65db8097
Deleted: sha256:7e3dcda6227af2bbd5333b0f1845cc6f64636a17b11aa579a66007c3d2167e3f
Deleted: sha256:31fde390a92af4bd28a0b31f7cb5ebab1fd77642af52a0d4f611eaa5f97dd077
Deleted: sha256:d69eb7b93c622f5c478ee5e421bef5eabc630987f558ec22e80c8c9132aaaff6
Deleted: sha256:fd8db1e757e74076e854563c5e9147e4ad49855af013c91d74a0560189507ecb
Deleted: sha256:cd6023b8a4eb676c3da70d1b07a151d15fd810248fa9908303acdf9cfced466b
Deleted: sha256:803255a957cacfd722a08d370189e18016fdf10c5a6fcf4bc0e0afbba1e7761b
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:d6fb010eb64a2c288891d0a64b0fab6126483e5e4bcb14d0ce1c926d67807601
  Associated tags:
 - 20190621-232454
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-232454
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-232454].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:d6fb010eb64a2c288891d0a64b0fab6126483e5e4bcb14d0ce1c926d67807601].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3626

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3626/display/redirect?page=changes>

Changes:

[aaltay] [Beam-6696] GroupIntoBatches transform for Python SDK (#8914)

------------------------------------------
[...truncated 206.51 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -412: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-354\x12\x04-352'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_14_04_05-8413293371189759688?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_14_11_49-9226262644077760768?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 874.816s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-205653
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:26f56a6f53c04a156a713e0def2d5d2e96a5ef73787970a0d73577035b388df5
Deleted: sha256:cb7e6871bc248fae9b34a74e54361666615486424921569979365e18a41c4b5e
Deleted: sha256:39aaa8e068832beb6d1aa92909b8b4dd57db8c7d501d89115af3eb81fd28fc45
Deleted: sha256:80edd57d4933c36d5d529a2343447c38ebabec21f149c0965f8d1baf612b8ee9
Deleted: sha256:07e3e741016268d00d43d1af31e3f826170fbac24db47a1e10684abf6010d30f
Deleted: sha256:e0e09b123f26a8ca176ea592d4035032231c6336e6de3154775189dc3fc5e380
Deleted: sha256:fe62cee30a800f1cb4ff3b34d655b1dab9bac07267169739199b53f9433c706f
Deleted: sha256:6d4469c8e092b67577b503396b51ddabfe3cac72eddae22f21860180cee29698
Deleted: sha256:139d9164f0bfc961e3040e00c61e3bea7a4f2d84db240ef97fa2a1abd1f8f27b
Deleted: sha256:9e2396e79b98c97305fb81d4cde37c403b7bc4832e0f613f95624d37a611bf07
Deleted: sha256:8099032535bc590b3b54f71d8744cd205ec935b3d5a5430ce925de75fc048bc9
Deleted: sha256:77aca32ec10c18b6ab93ff07bd9710c6a4e65b0bd431c9b7fef2c372f09561c6
Deleted: sha256:01420c81417024bb9a0dbc215826d8bd4fd30fcb807e20346c91cec32e2f1e0e
Deleted: sha256:c2124f70f1e0a5898301196c5ad75c417086932f018b0559b71b4cc8e77ea1cb
Deleted: sha256:0daf666af819d82c8e3337ae38e4412743e4d41db41ae4840a8e16ae55be5bb1
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:26f56a6f53c04a156a713e0def2d5d2e96a5ef73787970a0d73577035b388df5
  Associated tags:
 - 20190621-205653
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-205653
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-205653].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:26f56a6f53c04a156a713e0def2d5d2e96a5ef73787970a0d73577035b388df5].
Removed the container
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_ValCont #3625

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3625/display/redirect?page=changes>

Changes:

[github] Add a timeout to urlopen calls

------------------------------------------
[...truncated 206.51 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
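
The repeated "Discarding unparseable args" warnings mean the options parser saw --output without any registered declaration for it. The usual way to register such a flag is a PipelineOptions subclass; PipelineOptions and the _add_argparse_args hook are the standard Beam options API, while the subclass name below is made up for illustration:

    from apache_beam.options.pipeline_options import PipelineOptions


    class OutputOptions(PipelineOptions):  # hypothetical name

      @classmethod
      def _add_argparse_args(cls, parser):
        parser.add_argument('--output', help='GCS path for pipeline output.')


    opts = PipelineOptions(
        ['--output=gs://temp-storage-for-end-to-end-tests/output'])
    print(opts.view_as(OutputOptions).output)
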

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_11_55_18-17677420514361323070?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_12_01_31-15778003732774190268?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------
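
Note on the failure above (an interpretation, not part of the job output): bundle_processor.process_bundle routes each incoming data element to an operation keyed by its ptransform_id, and the key the harness receives (u'\n\x04-107\x12\x04-105', which looks like a length-prefixed, proto-encoded reference to "-107"/"-105" rather than a plain id) is not among the registered keys, so the plain dictionary lookup misses. A minimal sketch of that pattern follows; the element class, the registered id and the payload are hypothetical, not the actual Beam worker code.

# Hypothetical illustration of the failing lookup seen in the traceback above.
# Only the offending key string is taken from the log; everything else is made up.
class DataElement(object):
    def __init__(self, ptransform_id, data):
        self.ptransform_id = ptransform_id
        self.data = data

# Input operations registered for the bundle, keyed by plain transform id.
ops_by_transform_id = {'-105': 'DataInputOperation(...)'}

# The element arrives carrying an encoded reference instead of a registered id,
# so the lookup misses and the KeyError propagates back to the runner.
element = DataElement(u'\n\x04-107\x12\x04-105', b'payload')
try:
    op = ops_by_transform_id[element.ptransform_id]
except KeyError as err:
    print('unroutable data element: %r' % (err,))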

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 779.999s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-184724
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:9c5532d1a96c2aa8a0e74432509fc09cf4acb31e3fe3f6c40b69be8fd583b4b2
Deleted: sha256:51c503101e5e10225adac201035ab5e1bad541afe82bfc80590457239caa1c07
Deleted: sha256:cb04ee6f1c11a25ad508aa0ff769e26f1bd0bde211d67581293ca84c0eecc116
Deleted: sha256:085a5f22dee7d30169da25c63af4dca178777b9431933aebf095e938064b9f0e
Deleted: sha256:34b5ad7cd058b27e153fa485096bc56bcc765dd9da4ea812ed7b4e6e62edcbe0
Deleted: sha256:23a3ad65ed69dcfb38f46e2e6857a5d9dc3f375b6520dee4771e60feef0de97a
Deleted: sha256:d0b8c71c3334c416cb1339f5119dafb5a6f8b20ab37d101067dd3c86ce40038b
Deleted: sha256:e47bb38df0755772766907848293745a6abf291ce407455c10d62f88d4c5b2d8
Deleted: sha256:6315cb23606e20034bf954a696fcfbe6ef1994b17185f06a78a5c15fbee1e39d
Deleted: sha256:1d1d0eb1a30559fb92149632647101f02a9b7ff4afb70f9a66f9606033268ad7
Deleted: sha256:f806de62fa88dc601c40373f6d02d7de85f4ddc346e89feb0dfb72866c5d7764
Deleted: sha256:8e32ee109d4c793ee13c744525fcfc21f4533edd495bb6da62bc50a36de206f2
Deleted: sha256:c3748b4214cd5289760da58f3cd42e2962416452d0e6de1e18005cf753988e30
Deleted: sha256:d1dcd048785168b550723c88e622eb556e9a7834abcf3938a6e2a67d1aa08bd3
Deleted: sha256:83467f0035440c5d728bd2d6a219167ff98a73e510ea2c672d9c21b81a0ccb3f
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:9c5532d1a96c2aa8a0e74432509fc09cf4acb31e3fe3f6c40b69be8fd583b4b2
  Associated tags:
  - 20190621-184724
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-184724
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-184724].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:9c5532d1a96c2aa8a0e74432509fc09cf4acb31e3fe3f6c40b69be8fd583b4b2].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3624

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3624/display/redirect>

------------------------------------------
[...truncated 206.24 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -318: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-260\x12\x04-258'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -318: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-260\x12\x04-258'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_11_07_24-9563007122316113374?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_11_14_52-1116317196786741661?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 879.566s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-180010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:b50c5ed3c34e22080154e20c15a700623adb3c4fd4bb2f21eade98f504499b19
Deleted: sha256:b0552102031125369b10fa6cc6ca373ba8521082796d62759c5c85794e11ffb6
Deleted: sha256:ef8f67dd62876a6c3bce03cb3a5b5300bb32d69c287b66563adc2308e63c50ac
Deleted: sha256:5a3359bbf1de30fd652f9092067c7af4ba0a1c48f5df5878fb27b9f0845ada53
Deleted: sha256:d89f0c3749fdf24109d49525c2cf5b0c6278fc2d9cf7ea58b6afcff980ce0c34
Deleted: sha256:a49ffb47822c5d5cbbe4ed723d0b58b9c47ac7da7c7327b296ad990ba0a981b4
Deleted: sha256:b19f38feaa463052e69b9d6f3249fbf6f181885d702ac639f91f49b252cbd8c5
Deleted: sha256:86044680ce9e370821e00a839c96d2ded9a50d47921cfd5a2455669e82228923
Deleted: sha256:1127be7a7c0693b3918c423f4e6486cde937a89231759c9812808a62c73a3a14
Deleted: sha256:2fff3b51f9c1f9dd8e0b7cf072fb67c652d8ca35b578bb63c4cf13729a86c510
Deleted: sha256:6c119175890645af1746d6f51dba77e54a60a0b97098fda70c87ac66458d594e
Deleted: sha256:822e72e291cf12da029188ac14f6daee1aae3640dc6bfd8f7a2c72a7d46c1abe
Deleted: sha256:140ea6a1bb5ed179cd2093b7aa6c3338069920238ef9edb7532524f99f7580de
Deleted: sha256:529fdae32736c7628d5c0671a85d8a9b15a08d0b039f6407d1f0d204b96e3384
Deleted: sha256:e6c25d948ad48e36af4a5c77cfab729e3c77750f80ec2fecf99ba78a8bacae49
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:b50c5ed3c34e22080154e20c15a700623adb3c4fd4bb2f21eade98f504499b19
  Associated tags:
  - 20190621-180010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-180010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-180010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:b50c5ed3c34e22080154e20c15a700623adb3c4fd4bb2f21eade98f504499b19].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3623

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3623/display/redirect>

------------------------------------------
[...truncated 207.05 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -209: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-187\x12\x04-185'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -209: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-187\x12\x04-185'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_05_08_47-11821919677780860510?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_05_15_50-14458064987991167289?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 904.532s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-120010
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:92354bd87f7e4d0ed38d3259099a629b2fedaae4162e784731b74afb6fe11ade
Deleted: sha256:e9b50b56ce03eef2680f06e19814553539f14d54ff0676509371b126bc8fc2e7
Deleted: sha256:298db140f3f970b81819ae5026beb1889b07784b32a09e9d3e7bef691df3006b
Deleted: sha256:0803fff9d9dad9899ac18ef99e4591c3a84e4935976b312d6d87056c52a2bb5d
Deleted: sha256:c0dc5b697fae4544ea2892815361133afb49e698c5a1f299215c9629e334e95f
Deleted: sha256:93cd090c42b277965c008f601ba46d69ee454d249d029d13a5c57b144be8be0d
Deleted: sha256:6efede3d96aecf5a8064c9da50d636b4e17c060a87cec420de96be8a96929372
Deleted: sha256:23c018c574e230e0488d727656ee39a1ca2f16017a120e0280dfc6a3740e404f
Deleted: sha256:42911707288c6fce1e3988259c10ee4fc54a75c25ac1b74178814d11e46dd7a7
Deleted: sha256:eba42bf0510bd703dae706da9da4a7503b1a3ce745e3a053036176ccb997f415
Deleted: sha256:9401db5dd64b05e75404a122c64e11b122231af1d54aecada8dae1dce6718746
Deleted: sha256:38faebd4c47849997614b7e1576e15e14b611731dde67aa3121c659775cd2db5
Deleted: sha256:1f11de526c39350d6c028f9757237362d9228908ffc4e28b647cf88ab2903557
Deleted: sha256:73aaada02b483887a06620c39483f5ff107cac600b8d3162bbb5d8b8e2a1c620
Deleted: sha256:5c8bfd7e3bd08e03815ba587e5c54621be8ff67deaefe40d88500b3369a0b7e7
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:92354bd87f7e4d0ed38d3259099a629b2fedaae4162e784731b74afb6fe11ade
  Associated tags:
  - 20190621-120010
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-120010
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-120010].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:92354bd87f7e4d0ed38d3259099a629b2fedaae4162e784731b74afb6fe11ade].
Removed the container
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_ValCont #3622

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3622/display/redirect?page=changes>

Changes:

[mxm] Fixed typo and removed whitespace

------------------------------------------
[...truncated 206.48 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -368: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-346\x12\x04-344'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -368: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-346\x12\x04-344'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_03_45_15-5207930641514040559?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py",> line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",> line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
	at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
	Suppressed: java.lang.IllegalStateException: Already closed.
		at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
		at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
		at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
		... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -130: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-108\x12\x04-106'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-21_03_51_28-1082782328836773901?project=apache-beam-testing.

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 746.018s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190621-103809
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:7d0fe9e504c3b4ac6c16523bcf3a8e6b3ea267701ed85e886106e1451dfe753f
Deleted: sha256:6758637c5e0116b0d6d472a9bec679f01207ecaade115e80cae92b623d3f7a3c
Deleted: sha256:1d14b39e6e0af273869bc17defee1e797bc7d81ddf54f3754b41feb3e1ca222b
Deleted: sha256:e6fb85bfe829b5f135b88bdec3c679127bcb971cf0c7a2c1c6af0bc08cce0449
Deleted: sha256:6a4de4742ca6c8753b526d3cc4d7835744e6aeb7d7c43e54c88724fd95d61b20
Deleted: sha256:af09eb44bc5546d39d3278729f7ed903c14cec68abf106d96b7472373b6c2ad2
Deleted: sha256:6a1a3a960bd0d9217f15a69fab09b7f6f860c196ace885c6b38e00a28f920cee
Deleted: sha256:306c9badc59b45585659ff87cdede35c8b00853b57d8f38f04e2018bfd059039
Deleted: sha256:1a42ec914f5363995aaaff51a1da72d1f2ed149256dd34c74264c4566de02e68
Deleted: sha256:2b57ea9772421dd399b116a84971786d055e2ba00765e692c5c9bd3697ba61a8
Deleted: sha256:e5e38cdc3dfde29fb0d09e1de61329d91d095642c3587afb090462e30e1d699a
Deleted: sha256:323e9b628a8c3820d7ba061d5aa97c7c14c20cb45c5b9f3dc5aba319f61be0b5
Deleted: sha256:75481e034b5e74d2957199c91eb36f94e42a6b0abc599b726d7ae6c704757755
Deleted: sha256:6f767d70062ecf3d90b4e15ac84bae12df061c8c687247a60878d02a704e4fda
Deleted: sha256:62713a6220085d7e54f83ad8d1ff57f0cb1d6cb8d1eebda91baea6aa13665b7f
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:7d0fe9e504c3b4ac6c16523bcf3a8e6b3ea267701ed85e886106e1451dfe753f
  Associated tags:
 - 20190621-103809
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190621-103809
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190621-103809].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:7d0fe9e504c3b4ac6c16523bcf3a8e6b3ea267701ed85e886106e1451dfe753f].
Removed the container
Build step 'Execute shell' marked build as failure
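
Note on the two errors above: both tests fail with the same KeyError raised in
apache_beam/runners/worker/bundle_processor.py (process_bundle, line 593), i.e. the SDK
harness received a data element addressed to a ptransform id for which it had no input
operation registered. Below is a minimal, self-contained sketch of that lookup shape
only; the names (_FakeDataPayload, _FakeInputOperation, input_ops, process_bundle) are
hypothetical and are not the real bundle_processor internals.

# Illustrative sketch of the dict lookup that surfaces as the KeyError in the log.
# Only the lookup pattern mirrors the traceback line
# "data.ptransform_id].process_encoded(data.data)"; everything else is made up.
class _FakeDataPayload(object):
    def __init__(self, ptransform_id, data):
        self.ptransform_id = ptransform_id  # encoded id string, e.g. u'\n\x04-108\x12\x04-106'
        self.data = data

class _FakeInputOperation(object):
    def process_encoded(self, data):
        print('decoded %d bytes' % len(data))

# Input operations the harness registered for this bundle descriptor.
input_ops = {u'\n\x04-130\x12\x04-128': _FakeInputOperation()}

def process_bundle(payload):
    # A plain dict lookup, so an id the harness never registered
    # surfaces as a bare KeyError on the raw id string.
    input_ops[payload.ptransform_id].process_encoded(payload.data)

process_bundle(_FakeDataPayload(u'\n\x04-130\x12\x04-128', b'known id'))
try:
    process_bundle(_FakeDataPayload(u'\n\x04-108\x12\x04-106', b'unknown id'))
except KeyError as e:
    print('KeyError: %r' % (e.args[0],))  # mirrors the KeyError reported above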

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org