Posted to user@spark.apache.org by Manoj Samel <ma...@gmail.com> on 2014/01/16 05:32:30 UTC

Master and worker nodes in standalone deployment

When Spark is deployed on a cluster in standalone deployment mode (v0.8.1),
one of the nodes is started as the master and the others as workers.

What does the master node do? Can it participate in actual computations,
or does it just act as a coordinator?

Thanks,

Manoj

Re: Master and worker nodes in standalone deployment

Posted by Nan Zhu <zh...@gmail.com>.
It tracks the running worker processes, arranges for executors to be created for tasks on the worker nodes, communicates with the driver program, etc.
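
This division of labour can be inspected on a live cluster; a rough sketch, assuming Spark's default ports (script locations vary between releases):

```shell
# On the master machine: the Master daemon only coordinates. It accepts
# worker registrations and application submissions and tells workers to
# launch executors; it does not execute tasks itself.
./bin/start-master.sh

# The Master's web UI (default port 8080) lists the registered workers,
# the running applications, and which worker hosts each executor.
# Open http://<master-host>:8080 in a browser to see it.
```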

-- 
Nan Zhu


On Wednesday, January 15, 2014 at 11:37 PM, Manoj Samel wrote:

> Thanks,
> 
> Could you still explain what the master process does?
> 
> 
> On Wed, Jan 15, 2014 at 8:36 PM, Nan Zhu <zhunanmcgill@gmail.com> wrote:
> > you can start a worker process on the master node
> > 
> > so that all nodes in your cluster can participate in the computation
> > 
> > Best, 
> > 
> > -- 
> > Nan Zhu
> > 
> > 
> > On Wednesday, January 15, 2014 at 11:32 PM, Manoj Samel wrote:
> > 
> > > When Spark is deployed on a cluster in standalone deployment mode (v0.8.1), one of the nodes is started as the master and the others as workers.
> > > 
> > > What does the master node do? Can it participate in actual computations, or does it just act as a coordinator?
> > > 
> > > Thanks,
> > > 
> > > Manoj 
> > 
> 


Re: Master and worker nodes in standalone deployment

Posted by Manoj Samel <ma...@gmail.com>.
Thanks,

Could you still explain what the master process does?


On Wed, Jan 15, 2014 at 8:36 PM, Nan Zhu <zh...@gmail.com> wrote:

> you can start a worker process on the master node
>
> so that all nodes in your cluster can participate in the computation
>
> Best,
>
> --
> Nan Zhu
>
> On Wednesday, January 15, 2014 at 11:32 PM, Manoj Samel wrote:
>
> When Spark is deployed on a cluster in standalone deployment mode (v0.8.1),
> one of the nodes is started as the master and the others as workers.
>
> What does the master node do? Can it participate in actual
> computations, or does it just act as a coordinator?
>
> Thanks,
>
> Manoj
>
>
>

Re: Master and worker nodes in standalone deployment

Posted by Nan Zhu <zh...@gmail.com>.
you can start a worker process on the master node

so that all nodes in your cluster can participate in the computation
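
Concretely, that means running both daemons on the master machine; a minimal sketch, assuming a Spark 0.8.x layout and the default master port 7077 (script names and arguments differ slightly between versions):

```shell
# Start the master daemon on this machine.
./bin/start-master.sh

# Then start a worker on the same machine, registering it with the
# local master. Executors run inside workers, not inside the master,
# so this lets the master machine participate in computations too.
./spark-class org.apache.spark.deploy.worker.Worker spark://$(hostname):7077
```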

Best, 

-- 
Nan Zhu


On Wednesday, January 15, 2014 at 11:32 PM, Manoj Samel wrote:

> When Spark is deployed on a cluster in standalone deployment mode (v0.8.1), one of the nodes is started as the master and the others as workers.
> 
> What does the master node do? Can it participate in actual computations, or does it just act as a coordinator?
> 
> Thanks,
> 
> Manoj