Posted to user@spark.apache.org by Daniel Haviv <da...@veracity-group.com> on 2015/08/09 11:29:55 UTC

Starting a service with Spark Executors

Hi,
I'd like to start a service with each Spark Executor upon initialization and
have the distributed code reference that service locally.
What I'm trying to do is to invoke torch7 computations without reloading
the model for each row, by starting a Lua HTTP handler that will receive HTTP
requests for each row in my data.

Can this be achieved with Spark?

Thank you.
Daniel

Re: Starting a service with Spark Executors

Posted by Koert Kuipers <ko...@tresata.com>.
Starting is easy: just use a lazy val. Stopping is harder; I do not think
executors currently have a cleanup hook...
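
The lazy-val pattern can be sketched without Spark: a lazy val inside a Scala
object is initialized at most once per JVM (i.e. once per executor process),
even under concurrent first access from multiple tasks. In this sketch the
actual Lua HTTP handler launch is replaced by a counter stub; the names
(Service, handle) are illustrative, not a real API.

```scala
import java.util.concurrent.atomic.AtomicInteger

// A lazy val in an object is a per-JVM singleton: on a Spark executor,
// every task in that JVM shares one initialization.
object Service {
  val starts = new AtomicInteger(0)

  lazy val handle: String = {
    // Stands in for starting the Lua HTTP handler / loading the torch7 model.
    starts.incrementAndGet()
    "ready"
  }
}

// Simulate several concurrent tasks on one executor JVM all touching
// the service for the first time.
val tasks = (1 to 8).map(_ => new Thread(() => Service.handle))
tasks.foreach(_.start())
tasks.foreach(_.join())

println(Service.starts.get)  // the service was started exactly once
```

Inside a Spark job the call site would simply be something like
`rdd.map(row => Service.handle /* query the local handler */)`; each executor
initializes on first use and reuses the service for all subsequent rows. For
cleanup, a JVM shutdown hook (`sys.addShutdownHook`) is a best-effort option
given the lack of an executor teardown callback.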

On Sun, Aug 9, 2015 at 5:29 AM, Daniel Haviv <
daniel.haviv@veracity-group.com> wrote:

> Hi,
> I'd like to start a service with each Spark Executor upon initialization
> and have the distributed code reference that service locally.
> What I'm trying to do is to invoke torch7 computations without reloading
> the model for each row, by starting a Lua HTTP handler that will receive HTTP
> requests for each row in my data.
>
> Can this be achieved with Spark?
>
> Thank you.
> Daniel
>