Posted to hdfs-user@hadoop.apache.org by Xiaobo Gu <gu...@gmail.com> on 2011/07/01 17:34:44 UTC

How to run multiple data nodes and multiple task trackers on single server.

Hi:
   I am following the guides in
http://hadoop-karma.blogspot.com/2010/05/hadoop-cookbook-4-how-to-run-multiple.html
with Hadoop 0.20.203.0 on Solaris 10u9 X64, but I have revised the
hadoop-daemon.sh call to

hadoop-daemon.sh  $1 datanode $DN_CONF_OPTS

But I failed with error messages as following:


-bash-3.00$ run-additionalDN.sh start 1
starting datanode, logging to
/data/hdp/additionalDN/1/logs/hadoop-gpadmin-datanode-solsvr2.out
Usage: java DataNode
           [-rollback]

Regards,

Xiaobo Gu
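The "Usage: java DataNode" message usually means extra arguments reached the DataNode class itself: in 0.20.x, hadoop-daemon.sh forwards everything after the daemon name straight to the daemon's main(), and DataNode's main() accepts only flags like -rollback. Generic options such as --config must come before the start/stop action. A minimal sketch of a helper that builds the command line in the right order (the per-instance conf layout and function name are illustrative, not from the thread):

```shell
#!/bin/sh
# Build the hadoop-daemon.sh command line for one extra DataNode instance.
# The /data/hdp/additionalDN/<n>/conf layout is an assumption based on the
# instance directories shown in the thread, not a verified convention.
build_dn_cmd() {
  action=$1                                   # "start" or "stop"
  instance=$2                                 # instance number, e.g. 1
  conf_dir=/data/hdp/additionalDN/$instance/conf
  # --config must precede the action; nothing trails "datanode", so no
  # stray options are forwarded to the DataNode class.
  echo "hadoop-daemon.sh --config $conf_dir $action datanode"
}

build_dn_cmd start 1
# -> hadoop-daemon.sh --config /data/hdp/additionalDN/1/conf start datanode
```

If $DN_CONF_OPTS held something like "--config /path", placing it after "datanode" would reproduce exactly the usage error shown above.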

RE: How to run multiple data nodes and multiple task trackers on single server.

Posted by XiaoboGu <gu...@gmail.com>.
Hi,

Do we have to run multiple task trackers when running multiple data nodes on a single computer?

Regards,

Xiaobo Gu

> -----Original Message-----
> From: Xiaobo Gu [mailto:guxiaobo1982@gmail.com]
> Sent: Friday, July 01, 2011 11:35 PM
> To: hdfs-user@hadoop.apache.org
> Subject: How to run multiple data nodes and multiple task trackers on single server.
> 
> Hi:
>    I am following the guides in
> http://hadoop-karma.blogspot.com/2010/05/hadoop-cookbook-4-how-to-run-multiple.html
> with Hadoop 0.20.203.0 on Solaris 10u9 X64, but I have revised the
> hadoop-daemon.sh call to
> 
> hadoop-daemon.sh  $1 datanode $DN_CONF_OPTS
> 
> But I failed with error messages as following:
> 
> 
> -bash-3.00$ run-additionalDN.sh start 1
> starting datanode, logging to
> /data/hdp/additionalDN/1/logs/hadoop-gpadmin-datanode-solsvr2.out
> Usage: java DataNode
>            [-rollback]
> 
> Regards,
> 
> Xiaobo Gu


RE: How to run multiple data nodes and multiple task trackers on single server.

Posted by Brahma Reddy <br...@huawei.com>.
Hi Xiaobo Gu,

You can run multiple data nodes on a single machine with the following
configuration changes:

i) In hadoop/conf/hadoop-env.sh, give each instance its own directory for
the following property.

For example:
------------

#the directory where pid files are stored, by default /tmp

export HADOOP_PID_DIR=/var/processname 
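Concretely, two instances might each source a hadoop-env.sh containing a distinct value (directory names here are illustrative):

```shell
# hadoop-env.sh for DataNode instance 1 (path is an example)
export HADOOP_PID_DIR=/var/hadoop/dn1/pids
# hadoop-env.sh for DataNode instance 2 (path is an example)
export HADOOP_PID_DIR=/var/hadoop/dn2/pids
```

Without separate PID directories, the second daemon would see the first one's pid file and refuse to start (or stop the wrong process).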

ii) Give each instance its own listening ports.

For example, the following settings use free ports available on the
machine (port 0 tells the datanode to pick any free port):

<property>
<name>dfs.datanode.address</name>
<value>0.0.0.0:0</value>
<description>
The address where the datanode server listens
</description>
</property>

<property>
<name>dfs.datanode.http.address</name>
<value>0.0.0.0:0</value>
<description>
The datanode http server address and port</description>
</property>

<property>
<name>dfs.datanode.ipc.address</name>
<value>0.0.0.0:0</value>
<description>
The datanode ipc server address and port
</description>
</property>

Similarly, you can give different ports to the namenode, secondary
namenode, jobtracker, and tasktracker.
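For the TaskTracker, the analogous 0.20-era properties would go in mapred-site.xml (property names as in the 0.20.x defaults; worth double-checking against your mapred-default.xml):

```xml
<!-- mapred-site.xml: per-instance TaskTracker ports (port 0 = pick a free port) -->
<property>
<name>mapred.task.tracker.report.address</name>
<value>127.0.0.1:0</value>
</property>

<property>
<name>mapred.task.tracker.http.address</name>
<value>0.0.0.0:0</value>
</property>
```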



-----Original Message-----
From: Xiaobo Gu [mailto:guxiaobo1982@gmail.com] 
Sent: Friday, July 01, 2011 9:05 PM
To: hdfs-user@hadoop.apache.org
Subject: How to run multiple data nodes and multiple task trackers on single
server.

Hi:
   I am following the guides in
http://hadoop-karma.blogspot.com/2010/05/hadoop-cookbook-4-how-to-run-multiple.html
with Hadoop 0.20.203.0 on Solaris 10u9 X64, but I have revised the
hadoop-daemon.sh call to

hadoop-daemon.sh  $1 datanode $DN_CONF_OPTS

But I failed with error messages as following:


-bash-3.00$ run-additionalDN.sh start 1
starting datanode, logging to
/data/hdp/additionalDN/1/logs/hadoop-gpadmin-datanode-solsvr2.out
Usage: java DataNode
           [-rollback]

Regards,

Xiaobo Gu