Posted to common-user@hadoop.apache.org by 冯超 <hs...@gmail.com> on 2010/10/16 15:01:36 UTC

Some connection error when running HADOOP

Dear All,
    I've run into a problem: when I start Hadoop, I cannot find the slave
nodes. My OS is CentOS. When I run the MapReduce program, it warns me
that:


10/10/16 20:32:41 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 0 time(s).
10/10/16 20:32:42 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 1 time(s).
10/10/16 20:32:43 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 2 time(s).
10/10/16 20:32:44 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 3 time(s).
10/10/16 20:32:45 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 4 time(s).
10/10/16 20:32:46 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 5 time(s).
10/10/16 20:32:47 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 6 time(s).
10/10/16 20:32:48 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 7 time(s).
10/10/16 20:32:49 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 8 time(s).
10/10/16 20:32:50 INFO ipc.Client: Retrying connect to server: zawc1/
192.168.1.101:9000. Already tried 9 time(s).
java.net.ConnectException: Call to zawc1/192.168.1.101:9000 failed on
connection exception: java.net.ConnectException: Connection refused
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:724)
        at org.apache.hadoop.ipc.Client.call(Client.java:700)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
        at $Proxy0.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:348)
        at
org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:104)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:176)
        at
org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:75)
        at
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1367)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1379)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:215)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:120)
        at
org.apache.hadoop.examples.PiEstimator.launch(PiEstimator.java:187)
        at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:245)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:252)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at
org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:141)
        at
org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:61)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:165)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
        at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:100)
        at
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:300)
        at
org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:177)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:801)
        at org.apache.hadoop.ipc.Client.call(Client.java:686)
        ... 31 more

    Can anyone tell me what the reason is? Thanks for any help...

Re: Some connection error when running HADOOP

Posted by kaiming huang <hk...@gmail.com>.
Yeah, I have run into the same problem.
Don't use "localhost" or "127.0.0.1"; use your hostname or the exact IP
address instead!
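
For example, a minimal sketch of what I mean (hostname zawc1 and port 9000
are taken from your log; depending on your Hadoop version the file is
conf/core-site.xml or the older conf/hadoop-site.xml):

  <!-- conf/core-site.xml on every node: point the filesystem at the
       master's hostname (or its real IP), never at localhost/127.0.0.1 -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://zawc1:9000</value>
  </property>

And make sure that hostname resolves to the machine's real address on every
node, not to the loopback address.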

On Sat, Oct 16, 2010 at 10:29 PM, 冯超 <hs...@gmail.com> wrote:

> Thank you very much! I have fixed the problem.
> There was something wrong with the host mapping: by default the hostname was
> mapped to 127.0.0.1, not to the machine's actual IP address.
>
>
> 2010/10/16 Jander g <ja...@gmail.com>
>
> > Hi,
> > Did you format HDFS? Or maybe the HDFS metadata is corrupted.
> >
> > On Sat, Oct 16, 2010 at 9:01 PM, 冯超 <hs...@gmail.com> wrote:
> >
> > > Dear All,
> > >    I've run into a problem: when I start Hadoop, I cannot find the
> > > slave nodes. My OS is CentOS. When I run the MapReduce program, it
> > > warns me that:
> > >
> > >    Can anyone tell me what the reason is? Thanks for any help...
> > >
> >
> >
> >
> > --
> > Thanks,
> > Jander
> >
>

Re: Some connection error when running HADOOP

Posted by 冯超 <hs...@gmail.com>.
Thank you very much! I have fixed the problem.
There was something wrong with the host mapping: by default the hostname was
mapped to 127.0.0.1, not to the machine's actual IP address.
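
In case it helps anyone else, the broken mapping typically looks something
like this in /etc/hosts (an illustrative sketch, not my exact file):

  # before: the master's hostname resolves to loopback, so the NameNode
  # ends up listening only on 127.0.0.1:9000 and every other node gets
  # "Connection refused"
  127.0.0.1   localhost zawc1

  # after: keep localhost on loopback and map the hostname to the real IP
  127.0.0.1   localhost
  192.168.1.101   zawc1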


2010/10/16 Jander g <ja...@gmail.com>

> Hi,
> Did you format HDFS? Or maybe the HDFS metadata is corrupted.
>
> On Sat, Oct 16, 2010 at 9:01 PM, 冯超 <hs...@gmail.com> wrote:
>
> > Dear All,
> >    I've run into a problem: when I start Hadoop, I cannot find the
> > slave nodes. My OS is CentOS. When I run the MapReduce program, it
> > warns me that:
> >
> >    Can anyone tell me what the reason is? Thanks for any help...
> >
>
>
>
> --
> Thanks,
> Jander
>

Re: Some connection error when running HADOOP

Posted by Jander g <ja...@gmail.com>.
Hi,
Did you format HDFS? Or maybe the HDFS metadata is corrupted.
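
If it was never formatted (or the metadata got corrupted), a minimal sequence
along these lines, run on the master with the standard bin/ scripts, is worth
trying; note that formatting erases any existing HDFS data:

  bin/stop-all.sh               # stop any half-started daemons first
  bin/hadoop namenode -format   # only if HDFS was never formatted; this wipes it
  bin/start-all.sh
  jps                           # the NameNode process should now show up

If the NameNode still refuses connections on port 9000 after that, its log
under logs/ usually says why.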

On Sat, Oct 16, 2010 at 9:01 PM, 冯超 <hs...@gmail.com> wrote:

> Dear All,
>    I've run into a problem: when I start Hadoop, I cannot find the slave
> nodes. My OS is CentOS. When I run the MapReduce program, it warns me
> that:
>
>    Can anyone tell me what the reason is? Thanks for any help...
>



-- 
Thanks,
Jander