Posted to common-user@hadoop.apache.org by Francois Berenger <Fr...@lri.fr> on 2008/09/25 11:09:05 UTC
adding nodes while computing
Hello,
Is it possible to add slaves, with IP addresses not known in advance,
to a Hadoop cluster while a computation is going on?
And the reverse capability: is it possible to cleanly and permanently
remove a slave node from the Hadoop cluster?
Thank you,
François.
Re: adding nodes while computing
Posted by Billy Pearson <sa...@pearsonwholesale.com>.
You should be able to add nodes to the cluster while jobs are running: the
jobtracker should start assigning tasks to the new tasktrackers, and DFS
should start using the nodes for storage.
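What Billy describes can be sketched roughly as follows. This is a sketch, not part of the thread: it assumes a stock Hadoop installation of this era (shell scripts under bin/, configuration in conf/), and `new-slave.example.com` is a hypothetical hostname.

```shell
# On the new slave node, with Hadoop installed and hadoop-site.xml
# pointing at the running namenode and jobtracker, start the daemons.
# The node registers itself, so the master need not know its IP in advance.
bin/hadoop-daemon.sh start datanode      # node begins serving DFS storage
bin/hadoop-daemon.sh start tasktracker   # node begins accepting map/reduce tasks

# Optionally, on the master, record the host in conf/slaves so that
# future start-all.sh / stop-all.sh runs include it:
echo "new-slave.example.com" >> conf/slaves
```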
But map output files are stored on the slaves and copied to the reduce tasks,
so if a node goes down during a MR job, its maps will have to be run again.
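The second question, clean permanent removal, is usually handled by decommissioning rather than just killing the daemons, so that DFS can re-replicate the node's blocks first. A rough sketch, assuming the `dfs.hosts.exclude` property in hadoop-site.xml points at a file such as conf/excludes (the path and `old-slave.example.com` are hypothetical):

```shell
# On the master: list the node in the exclude file named by
# dfs.hosts.exclude, then tell the namenode to re-read it.
echo "old-slave.example.com" >> conf/excludes
bin/hadoop dfsadmin -refreshNodes

# The namenode now copies the node's blocks elsewhere. Once the DFS web
# UI reports the node as decommissioned, its daemons can be stopped
# safely without losing any replicas.
```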
Billy
"Francois Berenger" <Fr...@lri.fr> wrote
in message news:48DB5531.2060807@lri.fr...
> Hello,
>
> Is it possible to add slaves, with IP addresses not known in advance,
> to a Hadoop cluster while a computation is going on?
>
> And the reverse capability: is it possible to cleanly and permanently
> remove a slave node from the Hadoop cluster?
>
> Thank you,
> François.
>