Posted to common-user@hadoop.apache.org by Ankur Sethi <as...@i411.com> on 2007/07/17 18:24:34 UTC

adding datanodes on the fly?

How are datanodes added?  Do they get added and started only when the DFS
filesystem starts?  Can they be added while the Hadoop filesystem is running
by editing the slaves file, or does Hadoop have to be restarted?

 

 

Ankur Sethi
Systems Engineer
i411, Inc
(703) 793-3270 ex 220


RE: adding datanodes on the fly?

Posted by Ankur Sethi <as...@i411.com>.
Well, I do have it running in a rudimentary way now with the help of the forum
users.  I will try to put some of the things I learned on the wiki, either in
the main FAQ or in some kind of beginner's guide.

Re: adding datanodes on the fly?

Posted by Briggs <ac...@gmail.com>.
That line should be at the end of every resolved post!

On 7/17/07, Ted Dunning <td...@veoh.com> wrote:
>
> And to post your experiences on the wiki where you would expect somebody
> like yourself to have looked for what you now know.

-- 
"Conscious decisions by conscious minds are what make reality real"

Re: adding datanodes on the fly?

Posted by Ted Dunning <td...@veoh.com>.
And to post your experiences on the wiki where you would expect somebody
like yourself to have looked for what you now know.


On 7/17/07 10:49 AM, "Raghu Angadi" <ra...@yahoo-inc.com> wrote:

> You are strongly encouraged to experiment with the scripts to see what
> they do.


Re: adding datanodes on the fly?

Posted by Raghu Angadi <ra...@yahoo-inc.com>.
You can take a look at start-dfs.sh to see what it does.

Pretty much:  $ ssh datanode 'cd dir; bin/hadoop-daemon.sh start datanode'

You are strongly encouraged to experiment with the scripts to see what
they do. When something does not seem to work well, check the
corresponding log file in the logs/ directory as well.
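
For instance, if a newly started datanode does not seem to join the cluster,
a quick way to watch its log on that machine (the install path below is
illustrative, and the log file name pattern is just what hadoop-daemon.sh
typically produces -- check your logs/ directory for the exact name):

  $ cd /opt/hadoop                 # wherever Hadoop lives on the datanode
  $ ls logs/                       # find the datanode log for this host
  $ tail -f logs/hadoop-$USER-datanode-`hostname`.log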

> The documentation says to start DFS from the namenode, which will start up
> all the datanodes.

This is for the simple, common case.

Raghu.

> Thanks,
> Ankur
> 
> -----Original Message-----
> From: Raghu Angadi [mailto:rangadi@yahoo-inc.com] 
> Sent: Tuesday, July 17, 2007 1:33 PM
> To: hadoop-user@lucene.apache.org
> Subject: Re: adding datanodes on the fly?
> 
> Ankur Sethi wrote:
>> How are datanodes added?  Do they get added and started only when the DFS
>> filesystem starts?  Can they be added while the Hadoop filesystem is running
>> by editing the slaves file, or does Hadoop have to be restarted?
> 
> To add more datanodes, you can just bring up new datanodes with the
> right config at any time; the namenode will pick them up whenever
> they start.
> 
> The 'slaves' file is used only by scripts like bin/start-dfs.sh and
> bin/stop-dfs.sh, so adding new datanodes to it makes restarts and
> similar operations easier to manage.
> 
> Raghu.


Re: adding datanodes on the fly?

Posted by Brian Harrington <br...@yahoo-inc.com>.
On the datanode you can run:

bin/hadoop-daemon.sh --config <config_path> start datanode
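
For example, assuming Hadoop is installed under /opt/hadoop on the new machine
and that conf/hadoop-site.xml there points fs.default.name at the existing
namenode (both are assumptions about your setup, not requirements of the
command itself):

  $ cd /opt/hadoop
  $ bin/hadoop-daemon.sh --config /opt/hadoop/conf start datanode
  $ bin/hadoop dfsadmin -report    # the new datanode should appear here shortly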

Brian

Ankur Sethi wrote:
> Thanks for the reply.  But how do you start a datanode?  
> 
> The documentation says to start DFS from the namenode, which will start up
> all the datanodes.
> 
> Thanks,
> Ankur
> 
> -----Original Message-----
> From: Raghu Angadi [mailto:rangadi@yahoo-inc.com] 
> Sent: Tuesday, July 17, 2007 1:33 PM
> To: hadoop-user@lucene.apache.org
> Subject: Re: adding datanodes on the fly?
> 
> Ankur Sethi wrote:
>> How are datanodes added?  Do they get added and started only when the DFS
>> filesystem starts?  Can they be added while the Hadoop filesystem is running
>> by editing the slaves file, or does Hadoop have to be restarted?
> 
> To add more datanodes, you can just bring up new datanodes with the
> right config at any time; the namenode will pick them up whenever
> they start.
> 
> The 'slaves' file is used only by scripts like bin/start-dfs.sh and
> bin/stop-dfs.sh, so adding new datanodes to it makes restarts and
> similar operations easier to manage.
> 
> Raghu.


RE: adding datanodes on the fly?

Posted by Ankur Sethi <as...@i411.com>.
Thanks for the reply.  But how do you start a datanode?  

The documentation says to start DFS from the namenode, which will start up
all the datanodes.

Thanks,
Ankur

-----Original Message-----
From: Raghu Angadi [mailto:rangadi@yahoo-inc.com] 
Sent: Tuesday, July 17, 2007 1:33 PM
To: hadoop-user@lucene.apache.org
Subject: Re: adding datanodes on the fly?

Ankur Sethi wrote:
> How are datanodes added?  Do they get added and started only when the DFS
> filesystem starts?  Can they be added while the Hadoop filesystem is running
> by editing the slaves file, or does Hadoop have to be restarted?

To add more datanodes, you can just bring up new datanodes with the
right config at any time; the namenode will pick them up whenever
they start.

The 'slaves' file is used only by scripts like bin/start-dfs.sh and
bin/stop-dfs.sh, so adding new datanodes to it makes restarts and
similar operations easier to manage.

Raghu.

Re: adding datanodes on the fly?

Posted by Raghu Angadi <ra...@yahoo-inc.com>.
Ankur Sethi wrote:
> How are datanodes added?  Do they get added and started only when the DFS
> filesystem starts?  Can they be added while the Hadoop filesystem is running
> by editing the slaves file, or does Hadoop have to be restarted?

To add more datanodes, you can just bring up new datanodes with the
right config at any time; the namenode will pick them up whenever
they start.

The 'slaves' file is used only by scripts like bin/start-dfs.sh and
bin/stop-dfs.sh, so adding new datanodes to it makes restarts and
similar operations easier to manage.
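
As a sketch of that bookkeeping side, assuming the new machine is called
newnode.example.com (an illustrative name) and you want the start/stop scripts
on the namenode to cover it from now on:

  # On the namenode; this only affects bin/start-dfs.sh, bin/stop-dfs.sh etc.
  # The running datanode joined the cluster the moment it started.
  $ echo newnode.example.com >> conf/slaves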

Raghu.