Posted to common-user@hadoop.apache.org by 张茂森 <ma...@alibaba-inc.com> on 2007/03/01 12:15:54 UTC

Re: Running hadoop in 2 systems

Have you created the ssh key pair and sent the public key to the slave?
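
For reference, a minimal sketch of setting that up from the master, assuming
OpenSSH defaults and the slave account that appears in the slaves file below:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa     # generate a passphrase-less key pair on the master
cat ~/.ssh/id_rsa.pub | ssh 146736@10.229.62.56 \
  'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'
ssh 146736@10.229.62.56 hostname             # should now log in without a password prompt

The same needs to hold for localhost, since it is also listed in the slaves file.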

-----Original Message-----
From: Jayalakshmi.Muniasamy@cognizant.com
[mailto:Jayalakshmi.Muniasamy@cognizant.com]
Sent: March 1, 2007 17:32
To: hadoop-user@lucene.apache.org
Subject: Running hadoop in 2 systems



Hello,

I'm evaluating Hadoop for a large application.

When running the wordcount example, I experience an issue where my
master node cannot open a socket to port 50010 of my remote slave node.

When I run the example with only my master in the slaves file, it
works fine. When I add a second machine, I get the error.
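
A quick way to check raw reachability of that port from the master,
independent of Hadoop (assuming telnet is available):

telnet 10.229.62.56 50010    # "Connection refused" means nothing is listening on the slave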

Here is my config:

Running Hadoop-0.11.0

Server for master (10.229.62.6)
Remote slave (10.229.62.56)

My conf/slaves file content

===================
localhost
146736@10.229.62.56
====================

My masters file content

==============
localhost
==============

I've set the HADOOP_HOME and JAVA_HOME environment variables.
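
For completeness, JAVA_HOME can also be set in conf/hadoop-env.sh so the
start scripts pick it up on every node; the path below is a placeholder:

# conf/hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.5.0    # adjust to the JDK location on each machine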


I'm using the standard hadoop-default.xml.

Here's my hadoop-site.xml (which is the same on both machines):

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>
<name>fs.default.name</name>
<value>10.229.62.6:50010</value>
</property>

<property>
<name>mapred.job.tracker</name>
<value>10.229.62.6:50011</value>
</property>

<property>
<name>dfs.replication</name>
<value>2</value>
</property>

<property>
<name>dfs.datanode.port</name>
<value>50010</value>
</property>

<property>
<name>dfs.info.port</name>
<value>50070</value>
</property>

<property>
<name>dfs.name.dir</name>
<value>/tmp/hadoop-146736/dfs/name</value>
</property>

<property>
<name>dfs.data.dir</name>
<value>/tmp/hadoop-146736/dfs/data</value>
</property>

<property>
<name>dfs.client.buffer.dir</name>
<value>/tmp/hadoop-146736/dfs/tmp</value>
</property>

<property>
<name>mapred.local.dir</name>
<value>/tmp/hadoop-jaya/mapred/local</value>
</property>

<property>
<name>mapred.system.dir</name>
<value>/tmp/hadoop-jaya/mapred/system</value>
</property>

<property>
<name>mapred.temp.dir</name>
<value>/tmp/hadoop-jaya/mapred/temp</value>
</property>

<property>
<name>mapred.job.tracker.info.port</name>
<value>50030</value>
</property>

<property>
<name>mapred.task.tracker.report.port</name>
<value>50050</value>
</property>

</configuration>
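
One thing worth noting in this file: fs.default.name points clients at port
50010, which is the same port dfs.datanode.port assigns to every datanode, so
a datanode started on the master would contend with the namenode for that
port. A hypothetical fix is to give the filesystem URI its own port (9000
here is only an example value):

<property>
<name>fs.default.name</name>
<value>10.229.62.6:9000</value>
</property>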

Here is the terminal output on the server:

[146736@localhost hadoop-0.11.0]$ bin/hadoop dfs input
07/03/01 14:43:22 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 1 time(s).
07/03/01 14:43:23 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 2 time(s).
07/03/01 14:43:24 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 3 time(s).
07/03/01 14:43:25 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 4 time(s).
07/03/01 14:43:26 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 5 time(s).
07/03/01 14:43:27 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 6 time(s).
07/03/01 14:43:28 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 7 time(s).
07/03/01 14:43:29 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 8 time(s).
07/03/01 14:43:30 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 9 time(s).
07/03/01 14:43:31 INFO ipc.Client: Retrying connect to server:
/10.229.62.6:50010. Already tried 10 time(s).
Bad connection to FS. command aborted.
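
Before retrying the client, it is worth confirming that the namenode actually
came up on the master, for example:

jps                                      # should list a NameNode process
tail -50 logs/hadoop-*-namenode-*.log    # bind or startup errors would show up here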

Can anyone help me identify the issue? Is there some other step I'm
missing? Please help me find and solve the problem.

Thanks & Regards,
Jayalakshmi

