Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/03 13:02:00 UTC

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #1

See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/1/display/redirect>

------------------------------------------
[...truncated 108.46 KB...]
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, docopt, urllib3, certifi, chardet, idna, requests, hdfs, httplib2, pbr, funcsigs, mock, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, pytz, pyyaml, mmh3, avro, pyvcf, typing, numpy, pyarrow, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, proto-google-cloud-datastore-v1, googledatastore, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-1.9.0 cachetools-3.1.1 certifi-2019.3.9 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 funcsigs-1.0.2 google-api-core-1.11.1 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.32.2 google-cloud-core-0.29.1 google-cloud-datastore-1.7.4 google-cloud-pubsub-0.39.1 google-resumable-media-0.3.2 googleapis-common-protos-1.6.0 googledatastore-7.0.2 grpc-google-iam-v1-0.11.4 hdfs-2.5.2 httplib2-0.12.0 idna-2.8 mmh3-2.5.1 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.4 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.2.1 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.13.0 pyasn1-0.4.5 pyasn1-modules-0.2.5 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.4.0 python-dateutil-2.8.0 pytz-2019.1 pyvcf-0.6.8 pyyaml-3.13 requests-2.22.0 rsa-4.0 tenacity-5.0.4 typing-3.6.6 urllib3-1.25.3

> Task :sdks:python:apache_beam:testing:load_tests:run
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
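
That UserWarning is benign: PEP 440 requires an explicit number on a ".dev" suffix, so setuptools normalizes '2.14.0.dev' to '2.14.0.dev0'. The same rule can be seen with the packaging library (a sketch for illustration, not part of the build):

    from packaging.version import Version

    # PEP 440 treats a bare '.dev' as '.dev0', which is exactly what
    # the warning above reports.
    print(Version("2.14.0.dev"))  # -> 2.14.0.dev0
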
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ERROR

======================================================================
ERROR: testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 65, in tearDown
    result = self.pipeline.run()
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 109, in run
    state = result.wait_until_finish()
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 436, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline load_tests_Python_Flink_Batch_GBK_1_0603083446_00756ec3-6386-44a3-8419-77557e0c96d6 failed in state FAILED: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 589, in process_bundle
    ].process_encoded(data.data)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 142, in process_encoded
    input_stream, True)
  File "apache_beam/coders/coder_impl.py", line 988, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    def decode_from_stream(self, in_stream, nested):
  File "apache_beam/coders/coder_impl.py", line 1006, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    value = self._value_coder.decode_from_stream(in_stream, nested)
  File "apache_beam/coders/coder_impl.py", line 1051, in apache_beam.coders.coder_impl.LengthPrefixCoderImpl.decode_from_stream
    return self._value_coder.decode(in_stream.read(value_length))
  File "apache_beam/coders/coder_impl.py", line 173, in apache_beam.coders.coder_impl.StreamCoderImpl.decode
    return self.decode_from_stream(create_InputStream(encoded), False)
  File "apache_beam/coders/coder_impl.py", line 419, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
    return self.fallback_coder_impl.decode_from_stream(stream, nested)
  File "apache_beam/coders/coder_impl.py", line 203, in apache_beam.coders.coder_impl.CallbackCoderImpl.decode_from_stream
    return self._decoder(stream.read_all(nested))
AttributeError: 'module' object has no attribute 'SyntheticSource'
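
The AttributeError above most plausibly means the worker-side apache_beam installation does not expose the class that a pickled coder callback references by name; in these load tests that would be SyntheticSource from apache_beam.testing.synthetic_pipeline, suggesting a version skew between the submitting SDK and the SDK harness container image (an assumption; the log never names the module). A minimal sketch of that failure mode, using a hypothetical stand-in module rather than apache_beam itself:

    import pickle
    import sys
    import types

    # Hypothetical stand-in for the module the pickled object references;
    # not apache_beam code, just a reproduction of the failure mode.
    mod = types.ModuleType("synthetic_pipeline")
    sys.modules["synthetic_pipeline"] = mod

    class SyntheticSource(object):
        pass

    # Make pickle store the class by module-qualified name, as it would
    # for a class defined inside synthetic_pipeline on the submit side.
    SyntheticSource.__module__ = "synthetic_pipeline"
    mod.SyntheticSource = SyntheticSource

    payload = pickle.dumps(SyntheticSource())

    # Simulate a worker whose (older) copy of the module lacks the class.
    del mod.SyntheticSource

    pickle.loads(payload)
    # Raises AttributeError; on Python 2.7 the wording matches the log:
    # 'module' object has no attribute 'SyntheticSource'
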

-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: INFO: Metrics will not be collected
root: INFO: ==================== <function lift_combiners at 0x7fca1eeba668> ====================
root: DEBUG: 12 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read/Impulse_3\n  Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Split_4\n  Read/Split:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/AddRandomKeys_6\n  Read/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n  Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/GroupByKey_9\n  Read/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n  Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/RemoveRandomKeys_14\n  Read/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/ReadSplits_15\n  Read/ReadSplits:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start_16\n  Measure time: Start:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 0_17\n  GroupByKey 0:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 0_21\n  Ungroup 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 0_22\n  Measure time: End 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7fca1eeba6e0> ====================
root: DEBUG: 12 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read/Impulse_3\n  Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Split_4\n  Read/Split:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/AddRandomKeys_6\n  Read/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n  Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/GroupByKey_9\n  Read/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n  Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/RemoveRandomKeys_14\n  Read/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/ReadSplits_15\n  Read/ReadSplits:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start_16\n  Measure time: Start:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 0_17\n  GroupByKey 0:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 0_21\n  Ungroup 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 0_22\n  Measure time: End 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Runner option 'job_name' was already added
root: DEBUG: Runner option 'runner' was already added
root: DEBUG: Runner option 'temp_location' was already added
root: DEBUG: Runner option 'streaming' was already added
root: DEBUG: Runner option 'dataflow_kms_key' was already added
root: DEBUG: Runner option 'enable_streaming_engine' was already added
root: DEBUG: Runner option 'project' was already added
root: DEBUG: Runner option 'zone' was already added
root: DEBUG: Runner option 'environment_cache_millis' was already added
root: DEBUG: Runner option 'files_to_stage' was already added
root: DEBUG: Runner option 'job_endpoint' was already added
root: DEBUG: Runner option 'sdk_worker_parallelism' was already added
root: WARNING: Discarding unparseable args: ['--publish_to_big_query=false', '--metrics_dataset=load_test', '--metrics_table=python_flink_batch_GBK_1', '--input_options={"num_records": 200000000,"key_size": 1,"value_size":9}', '--iterations=1', '--fanout=1']
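
The "Discarding unparseable args" warning is expected: those flags belong to the load-test harness, not to the runner's option parser, which passes them through. A hedged sketch of how flags of that shape can be consumed (names mirror the warning; the real parsing lives in Beam's load-test utilities and may differ):

    import argparse
    import json

    parser = argparse.ArgumentParser()
    # --input_options carries a JSON document describing the synthetic input.
    parser.add_argument("--input_options", type=json.loads)
    parser.add_argument("--iterations", type=int)
    parser.add_argument("--fanout", type=int)
    parser.add_argument("--publish_to_big_query",
                        type=lambda s: s.lower() == "true")
    parser.add_argument("--metrics_dataset")
    parser.add_argument("--metrics_table")

    known, _ = parser.parse_known_args([
        "--publish_to_big_query=false",
        "--metrics_dataset=load_test",
        "--metrics_table=python_flink_batch_GBK_1",
        '--input_options={"num_records": 200000000,"key_size": 1,"value_size":9}',
        "--iterations=1",
        "--fanout=1",
    ])
    print(known.input_options["num_records"])  # -> 200000000
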
root: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 99d966747951f36a0d84c94c27f36b74)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:268)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:487)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:475)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:450)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:210)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:187)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:200)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:87)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:73)
	at org.apache.beam.vendor.guava.v20_0.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:111)
	at org.apache.beam.vendor.guava.v20_0.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:58)
	at org.apache.beam.vendor.guava.v20_0.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:75)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 589, in process_bundle
    ].process_encoded(data.data)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 142, in process_encoded
    input_stream, True)
  File "apache_beam/coders/coder_impl.py", line 988, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    def decode_from_stream(self, in_stream, nested):
  File "apache_beam/coders/coder_impl.py", line 1006, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    value = self._value_coder.decode_from_stream(in_stream, nested)
  File "apache_beam/coders/coder_impl.py", line 1051, in apache_beam.coders.coder_impl.LengthPrefixCoderImpl.decode_from_stream
    return self._value_coder.decode(in_stream.read(value_length))
  File "apache_beam/coders/coder_impl.py", line 173, in apache_beam.coders.coder_impl.StreamCoderImpl.decode
    return self.decode_from_stream(create_InputStream(encoded), False)
  File "apache_beam/coders/coder_impl.py", line 419, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
    return self.fallback_coder_impl.decode_from_stream(stream, nested)
  File "apache_beam/coders/coder_impl.py", line 203, in apache_beam.coders.coder_impl.CallbackCoderImpl.decode_from_stream
    return self._decoder(stream.read_all(nested))
AttributeError: 'module' object has no attribute 'SyntheticSource'

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:263)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:209)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:209)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
	... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 589, in process_bundle
    ].process_encoded(data.data)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 142, in process_encoded
    input_stream, True)
  File "apache_beam/coders/coder_impl.py", line 988, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    def decode_from_stream(self, in_stream, nested):
  File "apache_beam/coders/coder_impl.py", line 1006, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    value = self._value_coder.decode_from_stream(in_stream, nested)
  File "apache_beam/coders/coder_impl.py", line 1051, in apache_beam.coders.coder_impl.LengthPrefixCoderImpl.decode_from_stream
    return self._value_coder.decode(in_stream.read(value_length))
  File "apache_beam/coders/coder_impl.py", line 173, in apache_beam.coders.coder_impl.StreamCoderImpl.decode
    return self.decode_from_stream(create_InputStream(encoded), False)
  File "apache_beam/coders/coder_impl.py", line 419, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
    return self.fallback_coder_impl.decode_from_stream(stream, nested)
  File "apache_beam/coders/coder_impl.py", line 203, in apache_beam.coders.coder_impl.CallbackCoderImpl.decode_from_stream
    return self._decoder(stream.read_all(nested))
AttributeError: 'module' object has no attribute 'SyntheticSource'

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 589, in process_bundle
    ].process_encoded(data.data)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 142, in process_encoded
    input_stream, True)
  File "apache_beam/coders/coder_impl.py", line 988, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    def decode_from_stream(self, in_stream, nested):
  File "apache_beam/coders/coder_impl.py", line 1006, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.decode_from_stream
    value = self._value_coder.decode_from_stream(in_stream, nested)
  File "apache_beam/coders/coder_impl.py", line 1051, in apache_beam.coders.coder_impl.LengthPrefixCoderImpl.decode_from_stream
    return self._value_coder.decode(in_stream.read(value_length))
  File "apache_beam/coders/coder_impl.py", line 173, in apache_beam.coders.coder_impl.StreamCoderImpl.decode
    return self.decode_from_stream(create_InputStream(encoded), False)
  File "apache_beam/coders/coder_impl.py", line 419, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
    return self.fallback_coder_impl.decode_from_stream(stream, nested)
  File "apache_beam/coders/coder_impl.py", line 203, in apache_beam.coders.coder_impl.CallbackCoderImpl.decode_from_stream
    return self._decoder(stream.read_all(nested))
AttributeError: 'module' object has no attribute 'SyntheticSource'

root: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 24.923s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 52

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56s
3 actionable tasks: 3 executed

Publishing build scan...
https://gradle.com/s/fwrjj6cvso5du

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_GBK_Flink_Batch #2

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/2/display/redirect?page=changes>

