Posted to user@hadoop.apache.org by EdwardKing <zh...@neusoft.com> on 2014/02/17 02:03:56 UTC

Start hadoop service

I have installed hadoop-2.2.0 on two machines, one master and one slave, and I start the Hadoop services on the master machine:
[hadoop@master ~]$ ./start-dfs.sh
[hadoop@master ~]$ ./start-yarn.sh
[hadoop@master ~]$ ./mr-jobhistory-daemon.sh start historyserver

My question is whether I also need to start the Hadoop services on the slave machine, like this? Thanks.
[hadoop@slave ~]$ ./start-dfs.sh
[hadoop@slave ~]$ ./start-yarn.sh
[hadoop@slave ~]$ ./mr-jobhistory-daemon.sh start historyserver
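
A quick way to see from the master which DataNodes have actually joined HDFS is the dfsadmin report; the output below is abbreviated and approximate, not taken from this cluster:

[hadoop@master ~]$ hdfs dfsadmin -report
...
Datanodes available: 1 (1 total, 0 dead)
...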


RE: Start hadoop service

Posted by Brahma Reddy Battula <br...@huawei.com>.
You need to configure the slave's IP in the $HADOOP_HOME/etc/hadoop/slaves file.

Edit this file to add the slave machine's IP, then start the cluster with start-dfs.sh on the master machine.

By default, $HADOOP_HOME/etc/hadoop/slaves contains only the entry localhost, which is why only one DataNode is started, on the master machine.
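
With the IPs mentioned in this thread, the edited slaves file would look something like this (a sketch; resolvable hostnames work just as well, and keeping the master's own entry keeps a DataNode running there):

[hadoop@master ~]$ cat $HADOOP_HOME/etc/hadoop/slaves
172.11.12.6
172.11.12.7

After editing it, running stop-dfs.sh/stop-yarn.sh and then start-dfs.sh/start-yarn.sh on the master should bring up a DataNode and NodeManager on both machines, assuming passwordless SSH from the master to the slave is already set up.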




Thanks & Regards
Brahma Reddy Battula

Re: Start hadoop service

Posted by EdwardKing <zh...@neusoft.com>.
My OS is CentOS (kernel 2.6.18), the master IP is 172.11.12.6, and the slave IP is 172.11.12.7.
First I start the Hadoop services on the master, as follows:
[hadoop@master ~]$ cd /home/software/hadoop-2.2.0/sbin
[hadoop@master ~]$ ./start-dfs.sh
14/02/16 17:24:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [master]
master: starting namenode, logging to /home/software/hadoop-2.2.0/logs/hadoop-hadoop-namenode-master.out
master: starting datanode, logging to /home/software/hadoop-2.2.0/logs/hadoop-hadoop-datanode-master.out
Starting secondary namenodes [master]
master: starting secondarynamenode, logging to /home/software/hadoop-2.2.0/logs/hadoop-hadoop-secondarynamenode-master.out
14/02/16 17:24:43 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

[hadoop@master ~]$ ./start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/software/hadoop-2.2.0/logs/yarn-hadoop-resourcemanager-master.out
master: starting nodemanager, logging to /home/software/hadoop-2.2.0/logs/yarn-hadoop-nodemanager-master.out

[hadoop@master ~]$ ./mr-jobhistory-daemon.sh start historyserver
starting historyserver, logging to /home/software/hadoop-2.2.0/logs/mapred-hadoop-historyserver-master.out

[hadoop@master ~]$ jps
4439 DataNode
5400 Jps
4884 NodeManager
4331 NameNode
5342 JobHistoryServer
4595 SecondaryNameNode

Then I open Firefox to view the Hadoop web UIs, as follows:
http://172.11.12.6:9002/dfshealth.jsp
NameNode 'master:9000' (active)
......
Live Nodes  : 1 (Decommissioned: 0)


http://172.11.12.6:8088/cluster
Firefox can't establish a connection to the server at 172.11.12.6:8088.

http://172.11.12.6:9002/dfsnodelist.jsp?whatNodes=LIVE
Live Datanodes : 1
master 172.11.12.6:50010 ......

If I also start the Hadoop services on the slave, the web UI then shows the cluster correctly, as follows:
http://172.11.12.6:9002/dfsnodelist.jsp?whatNodes=LIVE
Live Datanodes : 2
master 172.11.12.6:50010 ......
slave  172.11.12.7:........

Why do I need to start the Hadoop services again on the slave? Which configuration file is wrong? Thanks
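
A likely place to check, given the answer elsewhere in this thread, is the slaves file on the master; if it still holds the default entry, start-dfs.sh only launches a DataNode locally. The contents shown here are the suspected default, not confirmed output:

[hadoop@master ~]$ cat /home/software/hadoop-2.2.0/etc/hadoop/slaves
localhost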

Re: Start hadoop service

Posted by Anand Mundada <an...@ymail.com>.
No.
You don't need to.
The master will start all required daemons on the slave.

Check all daemons using the jps command.
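
Once the master has started the cluster, jps on the slave should list the worker daemons, roughly like this (the PIDs here are made up):

[hadoop@slave ~]$ jps
2845 DataNode
2990 NodeManager
3207 Jps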

Sent from my iPhone
