Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/04/07 12:29:21 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Spark #558

See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/558/display/redirect>

Changes:


------------------------------------------
[...truncated 961.12 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
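
The failure above is a module-level dependency problem rather than a bug in the pipeline itself: when the Python SDK harness unpickles the serialized DoFn, dill re-imports the module that defined it (apache_beam/transforms/validate_runner_xlang_test.py), and that module runs "from nose.plugins.attrib import attr" at import time. The harness container used by this job does not have nose installed, so the import, and with it the whole bundle, fails. A minimal sketch of a guarded import that would tolerate a missing nose follows; whether the eventual fix in Beam took this form or instead installed nose into the container is not shown in this log.

    # Hypothetical guard at the top of validate_runner_xlang_test.py;
    # a sketch only, not the actual fix applied for this failure.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        # nose is absent on the SDK harness: fall back to a no-op decorator
        # factory so that unpickling a DoFn defined in this module succeeds.
        def attr(*args, **kwargs):
            def decorator(fn):
                return fn
            return decorator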

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 106.523s

FAILED (errors=2)
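
The _import_module frame in the traceback is the mechanism worth understanding: dill, like pickle, stores a module-level function as a (module, name) reference, so loading the pickle re-imports the defining module on the worker, and any top-level ImportError in that module surfaces at unpickle time. A self-contained sketch of that behaviour, using json.loads purely as a stand-in for the test module's code:

    # Sketch: functions from importable modules are pickled by reference,
    # so loads() must re-import their module and inherits its import errors.
    import sys
    import dill
    import json

    payload = dill.dumps(json.loads)   # stored as a reference to json.loads

    sys.modules["json"] = None         # make "import json" raise ImportError,
                                       # simulating a worker missing the module
    try:
        dill.loads(payload)
    except ImportError as exc:
        print("unpickle failed as on the harness: %s" % exc)
    finally:
        del sys.modules["json"]        # restore normal imports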

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 28041.
Stopping expansion service pid: 28042.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 22974.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 14s
105 actionable tasks: 83 executed, 20 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/z4bhzlts57gtk
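
Before re-running the failed Gradle tasks, a quick check of the Python environment the harness uses would confirm the diagnosis. The snippet below is a hypothetical diagnostic helper, not output from this build; run it with the same interpreter the SDK harness container uses:

    # Hypothetical diagnostic: can this environment import the test-only
    # dependency the tracebacks above complain about?
    import importlib

    for name in ("nose", "nose.plugins.attrib"):
        try:
            importlib.import_module(name)
            print("%s: importable" % name)
        except ImportError as exc:
            print("%s: MISSING (%s)" % (name, exc))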

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Spark #584

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/584/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_XVR_Spark #583

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/583/display/redirect?page=changes>

Changes:

[github] [Beam-9063]update documentation (#10952)


------------------------------------------
[...truncated 953.09 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 104.006s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 33m 6s
105 actionable tasks: 82 executed, 21 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yvpd34nbdczty

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #582

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/582/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-9716] Alias zone to worker_zone and warn user.


------------------------------------------
[...truncated 953.45 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 13: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 13: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 13: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 208.976s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 19s
105 actionable tasks: 87 executed, 16 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/oqjpmvonemlrs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #581

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/581/display/redirect?page=changes>

Changes:

[github] [BEAM-9618] Java SDK worker support for pulling bundle descriptors.


------------------------------------------
[...truncated 952.11 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 105.737s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 22s
105 actionable tasks: 83 executed, 20 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qmo4btprqcta2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #580

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/580/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-9322] [BEAM-1833] Better naming for composite transform output


------------------------------------------
[...truncated 951.98 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 111.932s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 43s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wgatzhddvftlq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #579

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/579/display/redirect?page=changes>

Changes:

[github] Update session.go (#11352)


------------------------------------------
[...truncated 950.90 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 12: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 12: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 12: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 106.330s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 7s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2x733atqdotcc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #578

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/578/display/redirect?page=changes>

Changes:

[crites] Updates documentation for WINDOWED_VALUE coder.

[mxm] [BEAM-9596] Ensure metrics are available in PipelineResult when the

[crites] Uses iterable coder for windows and copies all of timestamp encoding


------------------------------------------
[...truncated 951.54 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 113.310s

FAILED (errors=2)
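
Every failure in this digest reduces to the same root cause visible in the
traceback above: the Python SDK harness container raises "ImportError: No
module named nose.plugins.attrib" while unpickling the test DoFn, because
apache_beam/transforms/validate_runner_xlang_test.py imports nose at module
level and nose is evidently not installed in the python2.7 harness image.
Purely as an illustrative sketch (not the project's actual fix), the
module-level import could be made tolerant of the missing test-only package:

    # Hypothetical workaround sketch: fall back to a no-op decorator when
    # `nose` is absent, as it is inside the SDK harness container where
    # test-only dependencies may not be installed.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        def attr(*args, **kwargs):
            def decorator(fn):
                return fn  # no-op stand-in for nose.plugins.attrib.attr
            return decorator

With a guard like this, unpickling the DoFn on the worker would no longer
execute a failing import; the attributes would simply be ignored outside a
nose-driven test run.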

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 26s
105 actionable tasks: 81 executed, 22 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kkhz46owxbcje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #577

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/577/display/redirect?page=changes>

Changes:

[github] Update session.go

[github] Update stage.go

[github] Update server_test.go

[github] Update materialize.go

[github] Update materialize_test.go

[github] Update stage_test.go

[github] Update artifact.go

[github] Update provision.go

[github] Update retrieval.go

[github] Update staging.go

[github] Update translate.go

[github] Update datamgr.go

[github] Update datamgr_test.go

[github] Update logging.go

[github] Update logging_test.go

[github] Update monitoring.go

[github] Update session.go

[github] Update statemgr.go

[github] Update statemgr_test.go

[github] Update replace.go

[github] Update replace_test.go

[github] Update provision.go

[github] Update execute.go

[github] Update job.go

[github] Update translate.go

[github] Update translate.go

[github] Update job.go

[github] Update materialize.go

[kcweaver] [BEAM-9714] [Go SDK] Require --region flag in Dataflow runner.

[github] Update translate.go

[github] Update session.go

[github] Update materialize_test.go


------------------------------------------
[...truncated 951.26 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 115.593s

FAILED (errors=2)
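
Why a test module's import runs on the Spark worker at all: pickler.loads
(see the traceback above) delegates to dill, and dill serializes functions
that live in an importable module by reference, so deserialization re-imports
the defining module -- executing its top-level imports -- on the worker. A
minimal sketch of that behavior with plain dill, nothing Beam-specific:

    import dill

    # dill (like pickle) stores module-level callables as a reference to
    # their defining module, so loads() resolves them via __import__ on the
    # deserializing process, re-running that module's top-level imports.
    payload = dill.dumps(len)       # pickled by reference, not by value
    restored = dill.loads(payload)  # triggers the import on load
    assert restored is len

This is why the ImportError surfaces only inside the harness container and
not on the machine that submitted the job.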

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 25s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/e5pn2l74eudea

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #576

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/576/display/redirect>

Changes:


------------------------------------------
[...truncated 951.54 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 106.629s

FAILED (errors=2)
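
For completeness, the diagnosis is easy to confirm from inside the harness
image itself (hypothetical invocation -- the exact image name and tag are not
shown in this log):

    # Run under the container's Python to check whether the test-only
    # dependency named in the traceback is importable at all.
    import importlib

    try:
        importlib.import_module("nose.plugins.attrib")
        print("nose is available")
    except ImportError as exc:
        print("missing dependency:", exc)

If the import fails there, adding nose to the container's installed packages
(or guarding the import as sketched earlier) should resolve the recurring
failure.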

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 40s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/iylby7iktrmfo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_XVR_Spark #575

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/575/display/redirect?page=changes>

Changes:

[mxm] [BEAM-9580] Allow Flink 1.10 processing timers to finish on pipeline

[mxm] Revert "[BEAM-9580] Downgrade Flink version to 1.9 for Nexmark and

[mxm] [BEAM-9557] Fix strings used to verify test output


------------------------------------------
[...truncated 955.13 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 103.171s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 29896.
Stopping expansion service pid: 29897.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 24763.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 21s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hop4b7e6dwb32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
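
Every error in the runs above reduces to one root cause: while unpickling the serialized DoFn, the Python 2.7 SDK harness re-imports the function's defining module, apache_beam/transforms/validate_runner_xlang_test.py, and that module's top-level "from nose.plugins.attrib import attr" fails because nose is not installed in the harness container. One common mitigation for a test-only import is to guard it; a minimal sketch, assuming the @attr decorator only matters when the tests actually run under nose (an illustrative pattern, not necessarily the fix that was applied):

    try:
        from nose.plugins.attrib import attr
    except ImportError:
        # nose is absent (e.g. inside the SDK harness container). Fall back to a
        # no-op decorator so the module can still be imported during unpickling.
        def attr(*args, **kwargs):
            def decorator(fn):
                return fn
            return decorator

With the guard in place, importing the module on the harness no longer depends on nose, while the @attr(...) markers keep working wherever nose is installed.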



Build failed in Jenkins: beam_PostCommit_XVR_Spark #574

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/574/display/redirect?page=changes>

Changes:

[github] [BEAM-9147] Add a VideoIntelligence transform to Java SDK (#11261)


------------------------------------------
[...truncated 953.98 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 101.982s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 27628.
Stopping expansion service pid: 27629.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 359.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 30s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p2ohvrb7koyza

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
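
Why a missing test dependency surfaces as an unpickling error at all: module-level functions are pickled by reference, i.e. as a (module name, attribute name) pair, so loading the payload re-executes the defining module's imports on the consumer side; the traceback shows Beam's pickler delegating to dill for exactly this step. A self-contained sketch of the mechanism using the standard pickle module (the module name below is made up; dill's by-reference loading fails the same way):

    import pickle
    import sys
    import types

    # Build a throwaway module with a top-level function, standing in for a test module.
    mod = types.ModuleType("fake_test_module")
    exec("def fn(x):\n    return x + 1", mod.__dict__)
    sys.modules["fake_test_module"] = mod

    payload = pickle.dumps(mod.fn)  # serialized as (module name, attribute name), not code

    # Simulate the consumer side: an environment where the module cannot be imported.
    del sys.modules["fake_test_module"]
    try:
        pickle.loads(payload)  # triggers an import of fake_test_module
    except ImportError as exc:
        print("unpickling fails just as the harness did:", exc)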



Build failed in Jenkins: beam_PostCommit_XVR_Spark #573

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/573/display/redirect>

Changes:


------------------------------------------
[...truncated 954.02 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 103.370s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 9998.
Stopping expansion service pid: 9999.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 5577.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 16s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fmw7z3vti5dpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
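
Builds #573, #574, and #575 above, and #572 below, fail identically, which points at the harness container image rather than any individual change: every run will keep failing until nose is available in that image. A hedged sketch of a fail-fast check that a test setup step could run before submitting the cross-language pipeline, so the dependency gap is reported directly instead of via a remote unpickling traceback (the module list and wording are illustrative, not part of the Beam test suite):

    import importlib

    # Modules the pickled DoFns will need at import time on the SDK harness.
    REQUIRED_TEST_MODULES = ["nose.plugins.attrib"]

    missing = []
    for name in REQUIRED_TEST_MODULES:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)

    if missing:
        raise RuntimeError(
            "test environment is missing dependencies: %s "
            "(install nose in the SDK harness image)" % ", ".join(missing))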



Build failed in Jenkins: beam_PostCommit_XVR_Spark #572

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/572/display/redirect>

Changes:


------------------------------------------
[...truncated 951.59 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 106.774s

FAILED (errors=2)
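
Every traceback above bottoms out in the same two lines: while unpickling a DoFn, the SDK harness re-imports apache_beam/transforms/validate_runner_xlang_test.py, and line 24 of that module ("from nose.plugins.attrib import attr") raises ImportError because the harness container does not have the nose package installed. A minimal sketch of the failure and one possible shim follows; the fallback decorator and the idea of baking nose into the harness image are illustrative assumptions, not the fix that was actually applied.

    # Sketch of the failing import (Python 2.7, matching the harness logs).
    # Assumes an environment where the 'nose' package is not installed.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        # Hypothetical shim: a no-op stand-in for nose's @attr decorator so
        # the module can still be imported (e.g. during DoFn unpickling).
        # A real remedy would be installing nose in the harness image.
        def attr(*args, **kwargs):
            def decorate(fn):
                return fn
            return decorate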

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
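
As the Gradle output above suggests, the failing tasks could be re-run with more detail, e.g. "./gradlew :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava --stacktrace --info" from the Beam source root (an illustrative invocation: the task name is taken from the failure report above, the wrapper path is assumed).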

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 37m 21s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zs54ik72ycyki

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_XVR_Spark #571

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/571/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-5422] Document DynamicDestinations.getTable uniqueness requirement


------------------------------------------
[...truncated 952.64 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 108.833s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 3575.
Stopping expansion service pid: 3576.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 4967.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 32s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qkx72c3mik47a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Spark #570

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/570/display/redirect?page=changes>

Changes:

[github] [BEAM-9529] Remove datastore.v1, googledatastore (#11175)


------------------------------------------
[...truncated 952.46 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 110.491s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 39s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6o6y6ubnsarwk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Spark #569

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/569/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-9577] Plumb resources through Python job service and runner.


------------------------------------------
[...truncated 959.44 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 112.059s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 32497.
Stopping expansion service pid: 32498.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 26097.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 43s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/57tzmjze6kltq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
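
Every failure above bottoms out in the same place: dill unpickles the serialized DoFn inside the Python SDK harness, unpickling re-imports the defining module (apache_beam/transforms/validate_runner_xlang_test.py), and that module's top-level "from nose.plugins.attrib import attr" fails because nose is not installed in the harness container. A minimal sketch of one possible mitigation, assuming the container image itself is left unchanged, is to let the import degrade to a no-op decorator when nose is absent (illustrative only; not necessarily the fix that was applied):

    # Hypothetical guard for the top of validate_runner_xlang_test.py:
    # fall back to a no-op decorator factory when nose is missing from
    # the worker environment, so the module can still be unpickled there.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        def attr(*args, **kwargs):
            def decorator(fn):
                return fn
            return decorator

Installing nose into the Python SDK harness image would work equally well and keeps the test module untouched.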


Build failed in Jenkins: beam_PostCommit_XVR_Spark #568

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/568/display/redirect>

Changes:


------------------------------------------
[...truncated 957.88 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 103.866s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 0s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ut4siyxepmmpi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
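
The _import_module frame in these tracebacks is the telling detail: pickle stores a module-level callable by reference, as a (module, name) pair, and resolves that reference with __import__ at load time. A self-contained sketch using the standard pickle module (dill wraps this same machinery) makes the behaviour visible:

    # Pickling a module-level callable records only a (module, name)
    # reference; pickle.loads resolves it via __import__, which is why a
    # dependency missing from the worker image only fails at unpickle time.
    import pickle
    import pickletools

    payload = pickle.dumps(len)   # any module-level callable will do
    pickletools.dis(payload)      # note the GLOBAL/STACK_GLOBAL opcode
                                  # referencing the builtins module

This also explains the timing: the pipeline builds and submits cleanly, and the ImportError only surfaces once the Spark executors hand the bundle descriptor to the SDK harness for deserialization.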


Build failed in Jenkins: beam_PostCommit_XVR_Spark #567

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/567/display/redirect?page=changes>

Changes:

[robertwb] Update go protos.

[robertwb] [BEAM-9618] Pull bundle descriptors for Go.


------------------------------------------
[...truncated 962.51 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 105.823s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 26751.
Stopping expansion service pid: 26752.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 4353.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 46s
105 actionable tasks: 86 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2uawyo52gxt2q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
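
With builds #558 and #566 through #568 all failing on the identical ImportError, this reads as an environment regression in the Python SDK harness image rather than a flaky test. A quick way to confirm would be a dependency probe run inside the harness container (the image name below is a placeholder, not taken from the log):

    # Hypothetical probe script; run it inside the SDK harness container,
    # e.g. docker run --rm --entrypoint python <harness-image> probe.py
    import importlib

    for dep in ("nose", "nose.plugins.attrib"):
        try:
            importlib.import_module(dep)
            print("%s: OK" % dep)
        except ImportError as exc:
            print("%s: MISSING (%s)" % (dep, exc))

If both probes fail, the fix belongs in the image build, not in the tests.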


Build failed in Jenkins: beam_PostCommit_XVR_Spark #566

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/566/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-9691] Ensuring BQSource is avoided on FnApi


------------------------------------------
[...truncated 960.25 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 105.262s

FAILED (errors=2)
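
Every traceback above bottoms out in the same root cause: when the Python SDK harness unpickles the serialized DoFn, dill re-imports apache_beam/transforms/validate_runner_xlang_test.py, and that module's top-level "from nose.plugins.attrib import attr" fails because nose is not installed in the harness container image. A minimal sketch of one possible guard (hypothetical; not necessarily the fix the Beam developers chose) that keeps the module importable without nose:

    # Hypothetical guard for the top of validate_runner_xlang_test.py:
    # fall back to a no-op decorator factory when nose is absent,
    # e.g. inside the SDK harness container.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        def attr(*args, **kwargs):
            def decorator(fn):
                return fn  # leave the decorated test function unchanged
            return decorator

With a guard like this the module still imports cleanly on workers that lack nose, while local test runs under nose keep the real attribute plugin.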

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 8465.
Stopping expansion service pid: 8466.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 30459.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 38s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xxn23twt7xrxc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_XVR_Spark #565

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/565/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-9707] Hardcode Unified harness image for fixing dataflow VR 2

[github] Fix some Go SDK linter/vet warnings. (#11330)


------------------------------------------
[...truncated 960.81 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 111.353s

FAILED (errors=2)
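
Note that the harness never actually runs the tests; it only imports the module. dill pickles a function defined in an importable module by reference (module name plus attribute path), so loads() on the worker triggers a fresh __import__ of the whole module, executing every top-level statement, including the nose import. A small self-contained illustration using dill directly (Beam's pickler wraps dill, but the by-reference behavior shown here is dill's own):

    # Unpickling a function that was pickled by reference re-imports its
    # defining module on the loading side, running all top-level code.
    import dill
    import json  # stands in for any importable module

    payload = dill.dumps(json.dumps)  # stored as a reference to json.dumps
    restored = dill.loads(payload)    # __import__('json') happens here if needed
    assert restored({"ok": True}) == '{"ok": true}'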

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 5694.
Stopping expansion service pid: 5695.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 1396.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 16s
105 actionable tasks: 82 executed, 21 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cayuzsrso2lw4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Spark #564

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/564/display/redirect?page=changes>

Changes:

[github] Merge pull request #11205 [BEAM-9578] Defer expensive artifact


------------------------------------------
[...truncated 955.37 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 16: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 103.040s

FAILED (errors=2)
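
The complementary fix is to ship the test-only dependency inside the Python SDK harness image instead of guarding the import. As a hedged sketch, a pre-flight check like the following could run inside the container to fail fast when such dependencies are missing (the module list and exit message are illustrative, not Beam's actual tooling):

    # Hypothetical pre-flight check for the SDK harness container:
    # verify that test-only modules are importable before running bundles.
    import importlib
    import sys

    REQUIRED_FOR_XLANG_TESTS = ["nose", "nose.plugins.attrib"]

    missing = []
    for name in REQUIRED_FOR_XLANG_TESTS:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)

    if missing:
        sys.exit("harness image is missing test deps: " + ", ".join(missing))

Installing nose into the image (for example via its pip requirements) would make both this check and the failing import pass.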

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 45s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/apuzzfre6ai4m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Spark #563

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/563/display/redirect?page=changes>

Changes:

[rohde.samuel] Fix flaky interactive_runner_test


------------------------------------------
[...truncated 957.55 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 15: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 110.959s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 34s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jai3bcguar5ig

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
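
The two test errors above share a single root cause, visible at the bottom of each traceback: the Python SDK harness fails while unpickling the serialized DoFn because apache_beam/transforms/validate_runner_xlang_test.py executes "from nose.plugins.attrib import attr" at module import time, and the SDK harness container image does not have the nose package installed. One illustrative way to keep such a module importable in an environment without nose is a guarded import; this is a minimal sketch only, not necessarily the change that landed in Beam:

    # Hypothetical guarded import for a test module. If nose is absent on the
    # worker, fall back to a no-op @attr decorator so that unpickling a DoFn
    # defined in this module does not fail at import time.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        def attr(*labels, **kwlabels):
            def decorate(fn):
                return fn
            return decorate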


Build failed in Jenkins: beam_PostCommit_XVR_Spark #562

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/562/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-9715] Ensuring annotations_test passes in all


------------------------------------------
[...truncated 959.71 KB...]
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 12: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 12: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 12: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 203.057s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED

> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 19275.
Stopping expansion service pid: 19276.

> Task :runners:spark:job-server:sparkJobServerCleanup
Stopping job server pid: 31168.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 33m 3s
105 actionable tasks: 80 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cas2ffksu2j2k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
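
The traceback also shows why the missing package only surfaces inside the harness: dill, like the stock pickler it extends, serializes a top-level function by reference (module name plus attribute name), so loads() re-imports the defining module on the worker and re-runs its top-level imports there. A self-contained sketch of that mechanism, using plain pickle and a synthetic module name:

    # Unpickling a by-reference function re-imports its defining module, so a
    # module missing on the worker raises ImportError at load time.
    # "fake_test_module" is a stand-in name, not a real Beam module.
    import pickle
    import sys
    import types

    mod = types.ModuleType("fake_test_module")
    exec("def fn(x): return x + 1", mod.__dict__)
    sys.modules["fake_test_module"] = mod

    payload = pickle.dumps(mod.fn)        # stored as (module name, attribute name)
    del sys.modules["fake_test_module"]   # simulate a worker without the module

    try:
        pickle.loads(payload)             # triggers __import__("fake_test_module")
    except ImportError as e:
        print(e)                          # "No module named ..." as in the logs above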


Build failed in Jenkins: beam_PostCommit_XVR_Spark #561

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/561/display/redirect?page=changes>

Changes:

[github] Name the pipeline_v1 proto import

[github] Update materialize_test.go


------------------------------------------
[...truncated 959.07 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 108.264s

FAILED (errors=2)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 34s
105 actionable tasks: 84 executed, 19 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pe32j4ktlu526

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
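
Builds #558 through #562 all fail with the identical ImportError, which points at the worker environment rather than a flaky test. When the submitting interpreter mirrors the SDK harness image, a cheap preflight is to verify up front that every module the serialized DoFns reference is importable; the module list below is an assumption for illustration, not derived from the Beam test suite:

    # Preflight sketch: fail fast before submitting the pipeline if a required
    # module is missing from the current interpreter. Only meaningful when the
    # driver environment matches the SDK harness image.
    import importlib

    REQUIRED_MODULES = ["nose.plugins.attrib"]  # assumed list, for illustration

    def check_imports(modules):
        missing = []
        for name in modules:
            try:
                importlib.import_module(name)
            except ImportError:
                missing.append(name)
        if missing:
            raise RuntimeError("missing modules: %s" % ", ".join(missing))

    check_imports(REQUIRED_MODULES)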


Build failed in Jenkins: beam_PostCommit_XVR_Spark #560

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/560/display/redirect?page=changes>

Changes:

[github] Merge pull request #11244 from [BEAM-3097] _ReadFromBigQuery supports


------------------------------------------
[...truncated 959.23 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 212.849s

FAILED (errors=2)
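
The two errors above share one root cause: dill pickles a top-level DoFn by reference (module name + attribute name), so the SDK harness re-imports apache_beam.transforms.validate_runner_xlang_test when it unpickles, and that module's top-level "from nose.plugins.attrib import attr" fails because nose is not installed in the harness container. A minimal sketch of the mechanism, runnable outside Beam and using a hypothetical module name:

    import sys
    import types

    import dill

    # Fabricate an importable module, standing in for
    # apache_beam.transforms.validate_runner_xlang_test.
    mod = types.ModuleType("xlang_test_stub")
    exec("def my_dofn(x):\n    return x", mod.__dict__)
    sys.modules["xlang_test_stub"] = mod

    # dill serializes an importable function by reference, not by value.
    payload = dill.dumps(mod.my_dofn)

    # Simulate the worker container, where the module cannot be imported.
    del sys.modules["xlang_test_stub"]
    try:
        dill.loads(payload)  # triggers __import__("xlang_test_stub")
    except ImportError as exc:
        print("unpickle failed: %s" % exc)  # same failure class as above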

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
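
Both failed Gradle tasks point at the same missing package in the Python SDK harness image, so probing the image directly would confirm the diagnosis before re-running the 20+ minute suite. A sketch of such a probe; how the script gets into the container, and the image name (not shown in this log), are assumptions:

    # probe.py - run with the container's Python, e.g. via
    # "docker run --entrypoint python <sdk-harness-image> ..." where the
    # image name is a placeholder.
    import importlib

    for name in ("nose", "nose.plugins.attrib"):
        try:
            importlib.import_module(name)
            print("%s: OK" % name)
        except ImportError as exc:
            print("%s: MISSING (%s)" % (name, exc))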

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 59s
105 actionable tasks: 85 executed, 18 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tezjvj5xblk5o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Spark #559

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/559/display/redirect?page=changes>

Changes:

[github] Merge pull request #11226: [BEAM-9557] Fix timer window boundary


------------------------------------------
[...truncated 958.85 KB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:511)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:189)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:211)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:184)
	at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:178)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 14: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 500, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 374, in get
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 754, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 807, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 790, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 715, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 793, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1089, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1413, in create_par_do
    parameter)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1449, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/transforms/validate_runner_xlang_test.py", line 24, in <module>
    from nose.plugins.attrib import attr
ImportError: No module named nose.plugins.attrib

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 105.662s

FAILED (errors=2)
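
Build #559 reproduces the identical ImportError after the BEAM-9557 merge, so this looks like a persistent environment problem rather than a flake. Until nose is installed into the harness image (or the suite drops its nose dependency), one possible stopgap, offered as a sketch and not as the project's actual fix, is to make the test-only import degrade gracefully so the module (and therefore unpickling) still loads where nose is absent:

    # Sketch: fall back to a no-op @attr decorator when nose is missing.
    try:
        from nose.plugins.attrib import attr
    except ImportError:
        def attr(*args, **kwargs):
            def decorate(fn):
                # Return the test function unchanged; attribute-based
                # test selection is simply unavailable without nose.
                return fn
            return decorate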

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 29s
105 actionable tasks: 86 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6ebpoloud32fa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org