Posted to user@beam.apache.org by Benjamin Tan <be...@gmail.com> on 2019/09/17 02:49:26 UTC

How to use the loopback?

I'm trying to use the loopback via the environment_type option:

options = PipelineOptions(["--runner=PortableRunner",
                           "--environment_config=apachebeam/python3.7_sdk",
                           "--environment_type=LOOPBACK",
                           "--job_endpoint=dnnserver2:8099"])

Previously, I've done:

./gradlew -p sdks/python/container buildAll

And ran the Spark job server:

./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://dnnserver2:7077
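
Before submitting, a plain TCP probe can at least confirm the job endpoint is reachable (nothing Beam-specific, just a connectivity check; the job server's default Job API port is 8099):

import socket

# Simple reachability check for the job endpoint used above.
with socket.create_connection(("dnnserver2", 8099), timeout=5):
    print("job endpoint reachable")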

However, I get a pretty cryptic error message:

org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!

Any ideas?



Re: How to use the loopback?

Posted by Kyle Weaver <kc...@google.com>.
The StopWorker "errors" are harmless and there's an easy patch for them:
https://github.com/apache/beam/pull/9600

(I already included this in my reply on the other thread, but putting it
down here for the record)
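
The failing call is just self._worker_processes.pop(worker_id) on an id that was already removed, so the fix amounts to making StopWorker tolerant of unknown ids, roughly like this (a sketch of the idea from the traceback quoted below, not the literal diff; the kill() and StopWorkerResponse details are assumptions, so see the PR for the real change):

# Sketch only: pop with a default instead of raising KeyError when a
# worker id is unknown or was already stopped.
def StopWorker(self, stop_worker_request, unused_context):
    worker_process = self._worker_processes.pop(
        stop_worker_request.worker_id, None)
    if worker_process is not None:
        worker_process.kill()
    return beam_fn_api_pb2.StopWorkerResponse()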

Kyle Weaver | Software Engineer | github.com/ibzib | kcweaver@google.com


On Mon, Sep 16, 2019 at 11:25 PM Benjamin Tan <be...@gmail.com>
wrote:

> Some updates!
>
> I built and installed the Beam 2.16.0 Python library for both Python 2
> and Python 3.
>
> Python 3 leaves me with the same error.
>
> Python 2, however, apparently works: I can see the generated output. But
> I get the following errors:
>
> Any ideas? I'm not sure why this is so hard.
>
> /home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/apache_beam/__init__.py:84:
> UserWarning: You are using Apache Beam with Python 2. New releases of
> Apache Beam will soon support Python 3 only.
>   'You are using Apache Beam with Python 2. '
> ERROR:grpc._server:Exception calling application: u'1-1'
> Traceback (most recent call last):
>   File "/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/grpc/_server.py", line 434, in _call_behavior
>     response_or_iterator = behavior(argument, context)
>   File "/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/apache_beam/runners/worker/worker_pool_main.py", line 126, in StopWorker
>     worker_process = self._worker_processes.pop(stop_worker_request.worker_id)
> KeyError: u'1-1'
> [... the same StopWorker traceback repeats for worker ids u'2-1' through u'27-1' ...]
>
>
> On 2019/09/17 04:57:55, Benjamin Tan <be...@gmail.com> wrote:
> > Here you go!
> >
> > builder@dnnserver2:~/beam (release-2.16.0) $ ./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://dnnserver2:7077
> > Configuration on demand is an incubating feature.
> >
> > > Task :runners:spark:job-server:runShadow
> > Listening for transport dt_socket at address: 5005
> > log4j:WARN No appenders could be found for logger
> (org.apache.beam.vendor.grpc.v1p21p0.io.netty.util.internal.logging.InternalLoggerFactory).
> > log4j:WARN Please initialize the log4j system properly.
> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
> for more info.
> > Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> > 19/09/17 12:57:06 INFO SparkContext: Running Spark version 2.4.4
> > 19/09/17 12:57:06 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> > 19/09/17 12:57:06 INFO SparkContext: Submitted application:
> BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc
> > 19/09/17 12:57:06 INFO SecurityManager: Changing view acls to: builder
> > 19/09/17 12:57:06 INFO SecurityManager: Changing modify acls to: builder
> > 19/09/17 12:57:06 INFO SecurityManager: Changing view acls groups to:
> > 19/09/17 12:57:06 INFO SecurityManager: Changing modify acls groups to:
> > 19/09/17 12:57:06 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users  with view permissions: Set(builder);
> groups with view permissions: Set(); users  with modify permissions:
> Set(builder); groups with modify permissions: Set()
> > 19/09/17 12:57:07 INFO Utils: Successfully started service 'sparkDriver'
> on port 36069.
> > 19/09/17 12:57:07 INFO SparkEnv: Registering MapOutputTracker
> > 19/09/17 12:57:07 INFO SparkEnv: Registering BlockManagerMaster
> > 19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: Using
> org.apache.spark.storage.DefaultTopologyMapper for getting topology
> information
> > 19/09/17 12:57:07 INFO BlockManagerMasterEndpoint:
> BlockManagerMasterEndpoint up
> > 19/09/17 12:57:07 INFO DiskBlockManager: Created local directory at
> /tmp/blockmgr-92f6079e-4a85-4b09-b48b-5d58ddf304a6
> > 19/09/17 12:57:07 INFO MemoryStore: MemoryStore started with capacity
> 1949.1 MB
> > 19/09/17 12:57:07 INFO SparkEnv: Registering OutputCommitCoordinator
> > 19/09/17 12:57:07 INFO Utils: Successfully started service 'SparkUI' on
> port 4040.
> > 19/09/17 12:57:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at
> http://dnnserver2:4040
> > 19/09/17 12:57:07 INFO SparkContext: Added JAR
> /home/builder/beam/runners/spark/job-server/build/install/job-server-shadow/lib/beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar
> at
> spark://dnnserver2:36069/jars/beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar
> with timestamp 1568696227623
> > 19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Connecting to
> master spark://dnnserver2:7077...
> > 19/09/17 12:57:07 INFO TransportClientFactory: Successfully created
> connection to dnnserver2/10.64.1.208:7077 after 40 ms (0 ms spent in
> bootstraps)
> > 19/09/17 12:57:07 INFO StandaloneSchedulerBackend: Connected to Spark
> cluster with app ID app-20190917125707-0066
> > 19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Executor
> added: app-20190917125707-0066/0 on worker-20190916143324-10.64.1.208-41823
> (10.64.1.208:41823) with 12 core(s)
> > 19/09/17 12:57:07 INFO StandaloneSchedulerBackend: Granted executor ID
> app-20190917125707-0066/0 on hostPort 10.64.1.208:41823 with 12 core(s),
> 1024.0 MB RAM
> > 19/09/17 12:57:07 INFO Utils: Successfully started service
> 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37069.
> > 19/09/17 12:57:07 INFO NettyBlockTransferService: Server created on
> dnnserver2:37069
> > 19/09/17 12:57:07 INFO BlockManager: Using
> org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
> policy
> > 19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Executor
> updated: app-20190917125707-0066/0 is now RUNNING
> > 19/09/17 12:57:07 INFO BlockManagerMaster: Registering BlockManager
> BlockManagerId(driver, dnnserver2, 37069, None)
> > 19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: Registering block
> manager dnnserver2:37069 with 1949.1 MB RAM, BlockManagerId(driver,
> dnnserver2, 37069, None)
> > 19/09/17 12:57:07 INFO BlockManagerMaster: Registered BlockManager
> BlockManagerId(driver, dnnserver2, 37069, None)
> > 19/09/17 12:57:07 INFO BlockManager: Initialized BlockManager:
> BlockManagerId(driver, dnnserver2, 37069, None)
> > 19/09/17 12:57:07 INFO StandaloneSchedulerBackend: SchedulerBackend is
> ready for scheduling beginning after reached minRegisteredResourcesRatio:
> 0.0
> > 19/09/17 12:57:07 INFO SparkPipelineRunner: Running job
> BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc on
> Spark master spark://dnnserver2:7077
> > 19/09/17 12:57:07 INFO AggregatorsAccumulator: Instantiated aggregators
> accumulator:
> > 19/09/17 12:57:08 INFO MetricsAccumulator: Instantiated metrics
> accumulator: MetricQueryResults()
> > 19/09/17 12:57:08 WARN GroupNonMergingWindowsFunctions: Either coder
> LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent
> with equals. That might cause issues on some runners.
> > 19/09/17 12:57:08 INFO SparkContext: Starting job: collect at
> BoundedDataset.java:76
> > 19/09/17 12:57:08 INFO DAGScheduler: Got job 0 (collect at
> BoundedDataset.java:76) with 2 output partitions
> > 19/09/17 12:57:08 INFO DAGScheduler: Final stage: ResultStage 0 (collect
> at BoundedDataset.java:76)
> > 19/09/17 12:57:08 INFO DAGScheduler: Parents of final stage: List()
> > 19/09/17 12:57:08 INFO DAGScheduler: Missing parents: List()
> > 19/09/17 12:57:08 INFO DAGScheduler: Submitting ResultStage 0
> (MapPartitionsRDD[24] at map at BoundedDataset.java:75), which has no
> missing parents
> > 19/09/17 12:57:08 INFO MemoryStore: Block broadcast_0 stored as values
> in memory (estimated size 29.3 KB, free 1949.1 MB)
> > 19/09/17 12:57:08 INFO MemoryStore: Block broadcast_0_piece0 stored as
> bytes in memory (estimated size 11.0 KB, free 1949.1 MB)
> > 19/09/17 12:57:08 INFO BlockManagerInfo: Added broadcast_0_piece0 in
> memory on dnnserver2:37069 (size: 11.0 KB, free: 1949.1 MB)
> > 19/09/17 12:57:08 INFO SparkContext: Created broadcast 0 from broadcast
> at DAGScheduler.scala:1161
> > 19/09/17 12:57:08 INFO DAGScheduler: Submitting 2 missing tasks from
> ResultStage 0 (MapPartitionsRDD[24] at map at BoundedDataset.java:75)
> (first 15 tasks are for partitions Vector(0, 1))
> > 19/09/17 12:57:08 INFO TaskSchedulerImpl: Adding task set 0.0 with 2
> tasks
> > 19/09/17 12:57:10 INFO CoarseGrainedSchedulerBackend$DriverEndpoint:
> Registered executor NettyRpcEndpointRef(spark-client://Executor) (
> 10.64.1.208:41406) with ID 0
> > 19/09/17 12:57:10 INFO TaskSetManager: Starting task 0.0 in stage 0.0
> (TID 0, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> > 19/09/17 12:57:10 INFO TaskSetManager: Starting task 1.0 in stage 0.0
> (TID 1, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> > 19/09/17 12:57:10 INFO BlockManagerMasterEndpoint: Registering block
> manager 10.64.1.208:43075 with 366.3 MB RAM, BlockManagerId(0,
> 10.64.1.208, 43075, None)
> > 19/09/17 12:57:12 INFO BlockManagerInfo: Added broadcast_0_piece0 in
> memory on 10.64.1.208:43075 (size: 11.0 KB, free: 366.3 MB)
> > 19/09/17 12:57:15 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID
> 1, 10.64.1.208, executor 0):
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
> >         at
> org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
> >         at
> org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
> >         at
> org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
> >         at
> org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
> >         at
> org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
> >         at
> org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
> >         at
> org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
> >         at
> org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
> >         at
> org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
> >         at
> org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
> >         at
> org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
> >         at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> >         at
> org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
> >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
> >         at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> >         at
> org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
> >         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
> >         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
> >         at
> org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
> >         at
> org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
> >         at
> org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
> >         at
> org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
> >         at
> org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
> >         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
> >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
> >         at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> >         at
> org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
> >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
> >         at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> >         at
> org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
> >         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
> >         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
> >         at
> org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
> >         at
> org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
> >         at
> org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
> >         at
> org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
> >         at
> org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
> >         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
> >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
> >         at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> >         at
> org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
> >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
> >         at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> >         at org.apache.spark.scheduler.Task.run(Task.scala:123)
> >         at
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
> >         at
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
> >         at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
> >         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >         at java.lang.Thread.run(Thread.java:748)
> > Caused by:
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!
> >         at
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
> >         at
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
> >         at
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
> >         at
> org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
> >         at
> org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
> >         at
> org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
> >         at
> org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
> >         ... 54 more
> >
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID
> 0) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 1]
> > 19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.1 in stage 0.0
> (TID 2, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> > 19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.1 in stage 0.0
> (TID 3, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID
> 2) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 2]
> > 19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.2 in stage 0.0
> (TID 4, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID
> 3) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 3]
> > 19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.2 in stage 0.0
> (TID 5, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID
> 5) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 4]
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID
> 4) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 5]
> > 19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.3 in stage 0.0
> (TID 6, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> > 19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.3 in stage 0.0
> (TID 7, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID
> 7) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 6]
> > 19/09/17 12:57:15 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4
> times; aborting job
> > 19/09/17 12:57:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose
> tasks have all completed, from pool
> > 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID
> 6) on 10.64.1.208, executor 0:
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException
> (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!) [duplicate 7]
> > 19/09/17 12:57:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose
> tasks have all completed, from pool
> > 19/09/17 12:57:15 INFO TaskSchedulerImpl: Cancelling stage 0
> > 19/09/17 12:57:15 INFO TaskSchedulerImpl: Killing all running tasks in
> stage 0: Stage cancelled
> > 19/09/17 12:57:15 INFO DAGScheduler: ResultStage 0 (collect at
> BoundedDataset.java:76) failed in 7.248 s due to Job aborted due to stage
> failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task
> 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0):
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!
> >         [stack trace identical to the one above]
> >
> > Driver stacktrace:
> > 19/09/17 12:57:15 INFO DAGScheduler: Job 0 failed: collect at
> BoundedDataset.java:76, took 7.311041 s
> > 19/09/17 12:57:15 INFO SparkUI: Stopped Spark web UI at
> http://dnnserver2:4040
> > 19/09/17 12:57:15 INFO StandaloneSchedulerBackend: Shutting down all
> executors
> > 19/09/17 12:57:15 INFO CoarseGrainedSchedulerBackend$DriverEndpoint:
> Asking each executor to shut down
> > 19/09/17 12:57:15 INFO MapOutputTrackerMasterEndpoint:
> MapOutputTrackerMasterEndpoint stopped!
> > 19/09/17 12:57:15 INFO MemoryStore: MemoryStore cleared
> > 19/09/17 12:57:15 INFO BlockManager: BlockManager stopped
> > 19/09/17 12:57:15 INFO BlockManagerMaster: BlockManagerMaster stopped
> > 19/09/17 12:57:15 INFO
> OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
> OutputCommitCoordinator stopped!
> > 19/09/17 12:57:15 INFO SparkContext: Successfully stopped SparkContext
> > 19/09/17 12:57:15 ERROR JobInvocation: Error during job invocation
> BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc.
> > java.lang.RuntimeException: org.apache.spark.SparkException: Job aborted
> due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent
> failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0):
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!
> >         [stack trace identical to the one above]
> >
> > Driver stacktrace:
> >         at
> org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:58)
> >         at
> org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:75)
> >         at
> org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:102)
> >         at
> org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:90)
> >         at
> org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:115)
> >         at
> org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:78)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
> >         at
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
> >         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >         at java.lang.Thread.run(Thread.java:748)
> > Caused by: org.apache.spark.SparkException: Job aborted due to stage
> failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task
> 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0):
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!
> >         [stack trace identical to the one above; the archived message is truncated here]
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
> >         ... 54 more
> >
> > Driver stacktrace:
> >         at org.apache.spark.scheduler.DAGScheduler.org
> $apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
> >         at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
> >         at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
> >         at
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> >         at
> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
> >         at
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
> >         at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
> >         at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
> >         at scala.Option.foreach(Option.scala:257)
> >         at
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
> >         at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
> >         at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
> >         at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
> >         at
> org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
> >         at
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
> >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
> >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
> >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
> >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
> >         at
> org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
> >         at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> >         at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> >         at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
> >         at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
> >         at
> org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
> >         at
> org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
> >         at
> org.apache.beam.runners.spark.translation.BoundedDataset.getBytes(BoundedDataset.java:76)
> >         at
> org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInput(SparkBatchPortablePipelineTranslator.java:335)
> >         at
> org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translateExecutableStage(SparkBatchPortablePipelineTranslator.java:223)
> >         at
> org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translate(SparkBatchPortablePipelineTranslator.java:137)
> >         at
> org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$1(SparkPipelineRunner.java:97)
> >         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> >         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >         ... 3 more
> > 19/09/17 12:57:15 INFO BeamFileSystemArtifactRetrievalService: Manifest
> at
> /tmp/beam-artifact-staging/job_ce990c45-c56e-401e-aad6-fd72480e1050/MANIFEST
> has 0 artifact locations
> > 19/09/17 12:57:15 INFO BeamFileSystemArtifactStagingService: Removed dir
> /tmp/beam-artifact-staging/job_ce990c45-c56e-401e-aad6-fd72480e1050/
> >
> >
> >
> > On 2019/09/17 03:50:06, Kyle Weaver <kc...@google.com> wrote:
> > > Could you share more of the stack trace?
> > >
> > > Kyle Weaver | Software Engineer | github.com/ibzib | kcweaver@google.com
> > >
> > >
> > > On Mon, Sep 16, 2019 at 7:49 PM Benjamin Tan <benjamintanweihao@gmail.com> wrote:
> > >
> > > > I'm trying to use the loopback via the environment_type option:
> > > >
> > > > options = PipelineOptions(["--runner=PortableRunner",
> > > >                            "--environment_config=-apachebeam/python3.7_sdk ",
> > > >                            "--environment_type=LOOPBACK",
> > > >                            "--job_endpoint=dnnserver2:8099"])
> > > >
> > > > Previously, I've done:
> > > >
> > > > ./gradlew -p sdks/python/container buildAll
> > > >
> > > > And ran the Spark job server:
> > > >
> > > > ./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://dnnserver2:7077
> > > >
> > > > However, I get a pretty cryptic error message:
> > > >
> > > > org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
> > > >
> > > > Any ideas?
> > > >
> > > >
> > > >
> > >
> >
>

Re: How to use the loopback?

Posted by Benjamin Tan <be...@gmail.com>.
Some updates!

I built and installed the Beam 2.16.0 Python library for both Python 2 and Python 3.

Python 3 leaves me with the same error. 
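
For reference, here is a minimal sketch of the options that should be enough for LOOPBACK (assuming the installed SDK and the job server come from the same build; a version mismatch between the two is one common cause of the "UNIMPLEMENTED: Method not found!" failure, since both sides have to agree on the worker-pool gRPC methods). LOOPBACK hosts the SDK worker inside the submitting process, so no environment_config image should be needed:

from apache_beam.options.pipeline_options import PipelineOptions

# Sketch only: with LOOPBACK the worker runs in the submitting process,
# so no container image (environment_config) is passed.
options = PipelineOptions(["--runner=PortableRunner",
                           "--environment_type=LOOPBACK",
                           "--job_endpoint=dnnserver2:8099"])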

Apparently, though, Python 2 works: I can see the generated output. However, I still get the following errors:

Any ideas? I'm not sure why this is so hard.

/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
ERROR:grpc._server:Exception calling application: u'1-1'
Traceback (most recent call last):
  File "/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/grpc/_server.py", line 434, in _call_behavior
    response_or_iterator = behavior(argument, context)
  File "/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/apache_beam/runners/worker/worker_pool_main.py", line 126, in StopWorker
    worker_process = self._worker_processes.pop(stop_worker_request.worker_id)
KeyError: u'1-1'
ERROR:grpc._server:Exception calling application: u'2-1'
Traceback (most recent call last):
  File "/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/grpc/_server.py", line 434, in _call_behavior
    response_or_iterator = behavior(argument, context)
  File "/home/benjamintan/miniconda3/envs/apbeam-playground/lib/python2.7/site-packages/apache_beam/runners/worker/worker_pool_main.py", line 126, in StopWorker
    worker_process = self._worker_processes.pop(stop_worker_request.worker_id)
KeyError: u'2-1'
[... the identical StopWorker traceback repeats for worker IDs 3-1 through 27-1, differing only in the KeyError key ...]
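
These StopWorker errors look like benign shutdown noise: a stop request arrives for a worker ID that is no longer (or never was) in the pool's process dict, and the unguarded pop() raises KeyError. A minimal, self-contained sketch of the pattern and the obvious guard (hypothetical names, not the actual Beam code path):

# Stand-in for the worker pool's internal worker-id -> process mapping.
worker_processes = {'1-1': 'worker-process-handle'}

def stop_worker(worker_id):
    # pop() with a default turns a repeated or late stop request into a
    # no-op instead of a KeyError like the ones logged above.
    process = worker_processes.pop(worker_id, None)
    if process is not None:
        print('stopping worker %s' % worker_id)

stop_worker('1-1')  # removes and "stops" the worker
stop_worker('1-1')  # already gone: silently ignored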


On 2019/09/17 04:57:55, Benjamin Tan <be...@gmail.com> wrote: 
> Here you go! 
> 
> builder@dnnserver2:~/beam (release-2.16.0) $ ./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://dnnserver2:7077
> Configuration on demand is an incubating feature.
> 
> > Task :runners:spark:job-server:runShadow
> Listening for transport dt_socket at address: 5005
> log4j:WARN No appenders could be found for logger (org.apache.beam.vendor.grpc.v1p21p0.io.netty.util.internal.logging.InternalLoggerFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 19/09/17 12:57:06 INFO SparkContext: Running Spark version 2.4.4
> 19/09/17 12:57:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 19/09/17 12:57:06 INFO SparkContext: Submitted application: BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc
> 19/09/17 12:57:06 INFO SecurityManager: Changing view acls to: builder
> 19/09/17 12:57:06 INFO SecurityManager: Changing modify acls to: builder
> 19/09/17 12:57:06 INFO SecurityManager: Changing view acls groups to:
> 19/09/17 12:57:06 INFO SecurityManager: Changing modify acls groups to:
> 19/09/17 12:57:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(builder); groups with view permissions: Set(); users  with modify permissions: Set(builder); groups with modify permissions: Set()
> 19/09/17 12:57:07 INFO Utils: Successfully started service 'sparkDriver' on port 36069.
> 19/09/17 12:57:07 INFO SparkEnv: Registering MapOutputTracker
> 19/09/17 12:57:07 INFO SparkEnv: Registering BlockManagerMaster
> 19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
> 19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
> 19/09/17 12:57:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-92f6079e-4a85-4b09-b48b-5d58ddf304a6
> 19/09/17 12:57:07 INFO MemoryStore: MemoryStore started with capacity 1949.1 MB
> 19/09/17 12:57:07 INFO SparkEnv: Registering OutputCommitCoordinator
> 19/09/17 12:57:07 INFO Utils: Successfully started service 'SparkUI' on port 4040.
> 19/09/17 12:57:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://dnnserver2:4040
> 19/09/17 12:57:07 INFO SparkContext: Added JAR /home/builder/beam/runners/spark/job-server/build/install/job-server-shadow/lib/beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar at spark://dnnserver2:36069/jars/beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar with timestamp 1568696227623
> 19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://dnnserver2:7077...
> 19/09/17 12:57:07 INFO TransportClientFactory: Successfully created connection to dnnserver2/10.64.1.208:7077 after 40 ms (0 ms spent in bootstraps)
> 19/09/17 12:57:07 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20190917125707-0066
> 19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20190917125707-0066/0 on worker-20190916143324-10.64.1.208-41823 (10.64.1.208:41823) with 12 core(s)
> 19/09/17 12:57:07 INFO StandaloneSchedulerBackend: Granted executor ID app-20190917125707-0066/0 on hostPort 10.64.1.208:41823 with 12 core(s), 1024.0 MB RAM
> 19/09/17 12:57:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37069.
> 19/09/17 12:57:07 INFO NettyBlockTransferService: Server created on dnnserver2:37069
> 19/09/17 12:57:07 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
> 19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190917125707-0066/0 is now RUNNING
> 19/09/17 12:57:07 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, dnnserver2, 37069, None)
> 19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: Registering block manager dnnserver2:37069 with 1949.1 MB RAM, BlockManagerId(driver, dnnserver2, 37069, None)
> 19/09/17 12:57:07 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, dnnserver2, 37069, None)
> 19/09/17 12:57:07 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, dnnserver2, 37069, None)
> 19/09/17 12:57:07 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
> 19/09/17 12:57:07 INFO SparkPipelineRunner: Running job BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc on Spark master spark://dnnserver2:7077
> 19/09/17 12:57:07 INFO AggregatorsAccumulator: Instantiated aggregators accumulator:
> 19/09/17 12:57:08 INFO MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
> 19/09/17 12:57:08 WARN GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
> 19/09/17 12:57:08 INFO SparkContext: Starting job: collect at BoundedDataset.java:76
> 19/09/17 12:57:08 INFO DAGScheduler: Got job 0 (collect at BoundedDataset.java:76) with 2 output partitions
> 19/09/17 12:57:08 INFO DAGScheduler: Final stage: ResultStage 0 (collect at BoundedDataset.java:76)
> 19/09/17 12:57:08 INFO DAGScheduler: Parents of final stage: List()
> 19/09/17 12:57:08 INFO DAGScheduler: Missing parents: List()
> 19/09/17 12:57:08 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[24] at map at BoundedDataset.java:75), which has no missing parents
> 19/09/17 12:57:08 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 29.3 KB, free 1949.1 MB)
> 19/09/17 12:57:08 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 11.0 KB, free 1949.1 MB)
> 19/09/17 12:57:08 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on dnnserver2:37069 (size: 11.0 KB, free: 1949.1 MB)
> 19/09/17 12:57:08 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
> 19/09/17 12:57:08 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[24] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1))
> 19/09/17 12:57:08 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
> 19/09/17 12:57:10 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.64.1.208:41406) with ID 0
> 19/09/17 12:57:10 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> 19/09/17 12:57:10 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> 19/09/17 12:57:10 INFO BlockManagerMasterEndpoint: Registering block manager 10.64.1.208:43075 with 366.3 MB RAM, BlockManagerId(0, 10.64.1.208, 43075, None)
> 19/09/17 12:57:12 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.64.1.208:43075 (size: 11.0 KB, free: 366.3 MB)
> 19/09/17 12:57:15 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
>         at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
>         at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>         at org.apache.spark.scheduler.Task.run(Task.scala:123)
>         at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>         at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
>         at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
>         at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
>         ... 54 more
> 
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 1]
> 19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> 19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 2]
> 19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 3]
> 19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 5) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 4]
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 5]
> 19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
> 19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 7) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 6]
> 19/09/17 12:57:15 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4 times; aborting job
> 19/09/17 12:57:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
> 19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 7]
> 19/09/17 12:57:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
> 19/09/17 12:57:15 INFO TaskSchedulerImpl: Cancelling stage 0
> 19/09/17 12:57:15 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage cancelled
> 19/09/17 12:57:15 INFO DAGScheduler: ResultStage 0 (collect at BoundedDataset.java:76) failed in 7.248 s due to Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
>         at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
>         at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>         at org.apache.spark.scheduler.Task.run(Task.scala:123)
>         at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>         at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
>         at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
>         at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
>         ... 54 more
> 
> Driver stacktrace:
> 19/09/17 12:57:15 INFO DAGScheduler: Job 0 failed: collect at BoundedDataset.java:76, took 7.311041 s
> 19/09/17 12:57:15 INFO SparkUI: Stopped Spark web UI at http://dnnserver2:4040
> 19/09/17 12:57:15 INFO StandaloneSchedulerBackend: Shutting down all executors
> 19/09/17 12:57:15 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
> 19/09/17 12:57:15 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 19/09/17 12:57:15 INFO MemoryStore: MemoryStore cleared
> 19/09/17 12:57:15 INFO BlockManager: BlockManager stopped
> 19/09/17 12:57:15 INFO BlockManagerMaster: BlockManagerMaster stopped
> 19/09/17 12:57:15 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 19/09/17 12:57:15 INFO SparkContext: Successfully stopped SparkContext
> 19/09/17 12:57:15 ERROR JobInvocation: Error during job invocation BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc.
> java.lang.RuntimeException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
>         at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
>         at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>         at org.apache.spark.scheduler.Task.run(Task.scala:123)
>         at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>         at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
>         at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
>         at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
>         ... 54 more
> 
> Driver stacktrace:
>         at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:58)
>         at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:75)
>         at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:102)
>         at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:90)
>         at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:115)
>         at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:78)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
>         at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
>         at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
>         at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>         at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
>         at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
>         at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
>         at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
>         at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>         at org.apache.spark.scheduler.Task.run(Task.scala:123)
>         at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>         at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
>         at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
>         at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
>         at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
>         at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
>         at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
>         ... 54 more
> 
> Driver stacktrace:
>         at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
>         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>         at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
>         at scala.Option.foreach(Option.scala:257)
>         at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
>         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
>         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
>         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
>         at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
>         at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
>         at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>         at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
>         at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
>         at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
>         at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
>         at org.apache.beam.runners.spark.translation.BoundedDataset.getBytes(BoundedDataset.java:76)
>         at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInput(SparkBatchPortablePipelineTranslator.java:335)
>         at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translateExecutableStage(SparkBatchPortablePipelineTranslator.java:223)
>         at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translate(SparkBatchPortablePipelineTranslator.java:137)
>         at org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$1(SparkPipelineRunner.java:97)
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>         ... 3 more
> 19/09/17 12:57:15 INFO BeamFileSystemArtifactRetrievalService: Manifest at /tmp/beam-artifact-staging/job_ce990c45-c56e-401e-aad6-fd72480e1050/MANIFEST has 0 artifact locations
> 19/09/17 12:57:15 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-artifact-staging/job_ce990c45-c56e-401e-aad6-fd72480e1050/
> 
> 
> 
> On 2019/09/17 03:50:06, Kyle Weaver <kc...@google.com> wrote: 
> > Could you share more of the stack trace?
> > 
> > Kyle Weaver | Software Engineer | github.com/ibzib | kcweaver@google.com
> > 
> > 
> > On Mon, Sep 16, 2019 at 7:49 PM Benjamin Tan <be...@gmail.com>
> > wrote:
> > 
> > > I'm trying to use the loopback via the environment_type option:
> > >
> > > options = PipelineOptions(["--runner=PortableRunner",
> > >                            "--environment_config=-apachebeam/python3.7_sdk ",
> > >                            "--environment_type=LOOPBACK",
> > >                            "--job_endpoint=dnnserver2:8099"])
> > >
> > > Previously, I've done:
> > >
> > > ./gradlew -p sdks/python/container buildAll
> > >
> > > And ran the Spark job server:
> > >
> > > ./gradlew :runners:spark:job-server:runShadow
> > > -PsparkMasterUrl=spark://dnnserver2:7077
> > >
> > > However, I get a pretty cryptic error message:
> > >
> > > org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> > > org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> > > UNIMPLEMENTED: Method not found!
> > >
> > > Any ideas?
> > >
> > >
> > >
> > 
> 

Re: How to use the loopback?

Posted by Benjamin Tan <be...@gmail.com>.
Here you go! 

builder@dnnserver2:~/beam (release-2.16.0) $ ./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://dnnserver2:7077
Configuration on demand is an incubating feature.

> Task :runners:spark:job-server:runShadow
Listening for transport dt_socket at address: 5005
log4j:WARN No appenders could be found for logger (org.apache.beam.vendor.grpc.v1p21p0.io.netty.util.internal.logging.InternalLoggerFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/09/17 12:57:06 INFO SparkContext: Running Spark version 2.4.4
19/09/17 12:57:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/17 12:57:06 INFO SparkContext: Submitted application: BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc
19/09/17 12:57:06 INFO SecurityManager: Changing view acls to: builder
19/09/17 12:57:06 INFO SecurityManager: Changing modify acls to: builder
19/09/17 12:57:06 INFO SecurityManager: Changing view acls groups to:
19/09/17 12:57:06 INFO SecurityManager: Changing modify acls groups to:
19/09/17 12:57:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(builder); groups with view permissions: Set(); users  with modify permissions: Set(builder); groups with modify permissions: Set()
19/09/17 12:57:07 INFO Utils: Successfully started service 'sparkDriver' on port 36069.
19/09/17 12:57:07 INFO SparkEnv: Registering MapOutputTracker
19/09/17 12:57:07 INFO SparkEnv: Registering BlockManagerMaster
19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/09/17 12:57:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-92f6079e-4a85-4b09-b48b-5d58ddf304a6
19/09/17 12:57:07 INFO MemoryStore: MemoryStore started with capacity 1949.1 MB
19/09/17 12:57:07 INFO SparkEnv: Registering OutputCommitCoordinator
19/09/17 12:57:07 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/09/17 12:57:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://dnnserver2:4040
19/09/17 12:57:07 INFO SparkContext: Added JAR /home/builder/beam/runners/spark/job-server/build/install/job-server-shadow/lib/beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar at spark://dnnserver2:36069/jars/beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar with timestamp 1568696227623
19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://dnnserver2:7077...
19/09/17 12:57:07 INFO TransportClientFactory: Successfully created connection to dnnserver2/10.64.1.208:7077 after 40 ms (0 ms spent in bootstraps)
19/09/17 12:57:07 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20190917125707-0066
19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20190917125707-0066/0 on worker-20190916143324-10.64.1.208-41823 (10.64.1.208:41823) with 12 core(s)
19/09/17 12:57:07 INFO StandaloneSchedulerBackend: Granted executor ID app-20190917125707-0066/0 on hostPort 10.64.1.208:41823 with 12 core(s), 1024.0 MB RAM
19/09/17 12:57:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37069.
19/09/17 12:57:07 INFO NettyBlockTransferService: Server created on dnnserver2:37069
19/09/17 12:57:07 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/09/17 12:57:07 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190917125707-0066/0 is now RUNNING
19/09/17 12:57:07 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, dnnserver2, 37069, None)
19/09/17 12:57:07 INFO BlockManagerMasterEndpoint: Registering block manager dnnserver2:37069 with 1949.1 MB RAM, BlockManagerId(driver, dnnserver2, 37069, None)
19/09/17 12:57:07 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, dnnserver2, 37069, None)
19/09/17 12:57:07 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, dnnserver2, 37069, None)
19/09/17 12:57:07 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
19/09/17 12:57:07 INFO SparkPipelineRunner: Running job BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc on Spark master spark://dnnserver2:7077
19/09/17 12:57:07 INFO AggregatorsAccumulator: Instantiated aggregators accumulator:
19/09/17 12:57:08 INFO MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
19/09/17 12:57:08 WARN GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/09/17 12:57:08 INFO SparkContext: Starting job: collect at BoundedDataset.java:76
19/09/17 12:57:08 INFO DAGScheduler: Got job 0 (collect at BoundedDataset.java:76) with 2 output partitions
19/09/17 12:57:08 INFO DAGScheduler: Final stage: ResultStage 0 (collect at BoundedDataset.java:76)
19/09/17 12:57:08 INFO DAGScheduler: Parents of final stage: List()
19/09/17 12:57:08 INFO DAGScheduler: Missing parents: List()
19/09/17 12:57:08 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[24] at map at BoundedDataset.java:75), which has no missing parents
19/09/17 12:57:08 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 29.3 KB, free 1949.1 MB)
19/09/17 12:57:08 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 11.0 KB, free 1949.1 MB)
19/09/17 12:57:08 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on dnnserver2:37069 (size: 11.0 KB, free: 1949.1 MB)
19/09/17 12:57:08 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
19/09/17 12:57:08 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[24] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1))
19/09/17 12:57:08 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
19/09/17 12:57:10 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.64.1.208:41406) with ID 0
19/09/17 12:57:10 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
19/09/17 12:57:10 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
19/09/17 12:57:10 INFO BlockManagerMasterEndpoint: Registering block manager 10.64.1.208:43075 with 366.3 MB RAM, BlockManagerId(0, 10.64.1.208, 43075, None)
19/09/17 12:57:12 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.64.1.208:43075 (size: 11.0 KB, free: 366.3 MB)
19/09/17 12:57:15 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
        at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
        at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        ... 54 more

19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 1]
19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 2]
19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 3]
19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 5) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 4]
19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 5]
19/09/17 12:57:15 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, 10.64.1.208, executor 0, partition 0, PROCESS_LOCAL, 7872 bytes)
19/09/17 12:57:15 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0, partition 1, PROCESS_LOCAL, 7883 bytes)
19/09/17 12:57:15 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 7) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 6]
19/09/17 12:57:15 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4 times; aborting job
19/09/17 12:57:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
19/09/17 12:57:15 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on 10.64.1.208, executor 0: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException (org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!) [duplicate 7]
19/09/17 12:57:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
19/09/17 12:57:15 INFO TaskSchedulerImpl: Cancelling stage 0
19/09/17 12:57:15 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage cancelled
19/09/17 12:57:15 INFO DAGScheduler: ResultStage 0 (collect at BoundedDataset.java:76) failed in 7.248 s due to Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
        at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
        at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        ... 54 more

Driver stacktrace:
19/09/17 12:57:15 INFO DAGScheduler: Job 0 failed: collect at BoundedDataset.java:76, took 7.311041 s
19/09/17 12:57:15 INFO SparkUI: Stopped Spark web UI at http://dnnserver2:4040
19/09/17 12:57:15 INFO StandaloneSchedulerBackend: Shutting down all executors
19/09/17 12:57:15 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
19/09/17 12:57:15 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/09/17 12:57:15 INFO MemoryStore: MemoryStore cleared
19/09/17 12:57:15 INFO BlockManager: BlockManager stopped
19/09/17 12:57:15 INFO BlockManagerMaster: BlockManagerMaster stopped
19/09/17 12:57:15 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/09/17 12:57:15 INFO SparkContext: Successfully stopped SparkContext
19/09/17 12:57:15 ERROR JobInvocation: Error during job invocation BeamApp-builder-0917045705-aa3391a3_96f998c3-3c0d-42f1-885c-f1e2538310bc.
java.lang.RuntimeException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
        at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
        at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        ... 54 more

Driver stacktrace:
        at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:58)
        at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:75)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:102)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:90)
        at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:115)
        at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:78)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, 10.64.1.208, executor 0): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:125)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
        at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: UNIMPLEMENTED: Method not found!
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:235)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:216)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
        at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.startWorker(BeamFnExternalWorkerPoolGrpc.java:226)
        at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:113)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        ... 54 more

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
        at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
        at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
        at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
        at org.apache.beam.runners.spark.translation.BoundedDataset.getBytes(BoundedDataset.java:76)
        at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInput(SparkBatchPortablePipelineTranslator.java:335)
        at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translateExecutableStage(SparkBatchPortablePipelineTranslator.java:223)
        at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translate(SparkBatchPortablePipelineTranslator.java:137)
        at org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$1(SparkPipelineRunner.java:97)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        ... 3 more
19/09/17 12:57:15 INFO BeamFileSystemArtifactRetrievalService: Manifest at /tmp/beam-artifact-staging/job_ce990c45-c56e-401e-aad6-fd72480e1050/MANIFEST has 0 artifact locations
19/09/17 12:57:15 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-artifact-staging/job_ce990c45-c56e-401e-aad6-fd72480e1050/
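
The failing call in the trace above is BeamFnExternalWorkerPool/StartWorker. gRPC reports UNIMPLEMENTED when the server does not recognize the method being invoked, which in a Beam portability setup commonly means the submitting SDK and the job server come from different Beam builds. A minimal client-side sanity check (a sketch, assuming a pip- or source-installed apache_beam on the submitting machine):

import apache_beam
# Compare against the job server build shown in the Gradle output above
# (beam-runners-spark-job-server-2.16.0-SNAPSHOT.jar); mismatched
# client/server builds can surface as UNIMPLEMENTED gRPC errors.
print(apache_beam.__version__)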



On 2019/09/17 03:50:06, Kyle Weaver <kc...@google.com> wrote: 
> Could you share more of the stack trace?
> 
> Kyle Weaver | Software Engineer | github.com/ibzib | kcweaver@google.com
> 
> 
> On Mon, Sep 16, 2019 at 7:49 PM Benjamin Tan <be...@gmail.com>
> wrote:
> 
> > I'm trying to use the loopback via the environment_type option:
> >
> > options = PipelineOptions(["--runner=PortableRunner",
> >                            "--environment_config=-apachebeam/python3.7_sdk ",
> >                            "--environment_type=LOOPBACK",
> >                            "--job_endpoint=dnnserver2:8099"])
> >
> > Previously, I've done:
> >
> > ./gradlew -p sdks/python/container buildAll
> >
> > And ran the Spark job server:
> >
> > ./gradlew :runners:spark:job-server:runShadow
> > -PsparkMasterUrl=spark://dnnserver2:7077
> >
> > However, I get a pretty cryptic error message:
> >
> > org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> > org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> > UNIMPLEMENTED: Method not found!
> >
> > Any ideas?
> >
> >
> >
> 

Re: How to use the loopback?

Posted by Kyle Weaver <kc...@google.com>.
Could you share more of the stack trace?

Kyle Weaver | Software Engineer | github.com/ibzib | kcweaver@google.com


On Mon, Sep 16, 2019 at 7:49 PM Benjamin Tan <be...@gmail.com>
wrote:

> I'm trying to use the loopback via the environment_type option:
>
> options = PipelineOptions(["--runner=PortableRunner",
>                            "--environment_config=-apachebeam/python3.7_sdk ",
>                            "--environment_type=LOOPBACK",
>                            "--job_endpoint=dnnserver2:8099"])
>
> Previously, I've done:
>
> ./gradlew -p sdks/python/container buildAll
>
> And ran the Spark job server:
>
> ./gradlew :runners:spark:job-server:runShadow
> -PsparkMasterUrl=spark://dnnserver2:7077
>
> However, I get a pretty cryptic error message:
>
> org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException:
> UNIMPLEMENTED: Method not found!
>
> Any ideas?
>
>
>
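
For anyone replaying this setup from the archive, here is a minimal, self-contained sketch of a pipeline built around the options quoted above. It assumes the Spark job server from this thread is already listening on dnnserver2:8099; the input elements and the print step are placeholders for illustration, and --environment_config is omitted because LOOPBACK runs the SDK worker inside the submitting process.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder endpoint from this thread; point this at your own job server.
options = PipelineOptions([
    "--runner=PortableRunner",
    "--environment_type=LOOPBACK",
    "--job_endpoint=dnnserver2:8099",
])

# A trivial pipeline, just enough to exercise the LOOPBACK environment.
with beam.Pipeline(options=options) as p:
    (p
     | "Create" >> beam.Create(["hello", "loopback"])
     | "Upper" >> beam.Map(str.upper)
     | "Print" >> beam.Map(print))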