Posted to mapreduce-user@hadoop.apache.org by Mahmood Naderan <nt...@yahoo.com> on 2015/05/04 07:52:29 UTC
Connection issues
Dear all,
My problem with "ipc.Client: Retrying connect to server" is still open! To start a new, clean thread, here is the problem description.
[mahmood@tiger Index]$ which hadoop
~/bigdatabench/apache/hadoop-1.0.2/bin/hadoop
[mahmood@tiger Index]$ cat /etc/hosts
127.0.0.1 localhost.localdomain localhost
192.168.1.5 tiger
192.168.1.100 orca
192.168.1.6 zardalou
[mahmood@tiger Index]$ hadoop -jar indexdata.jar `pwd`/result hdfs://127.0.0.1:9000/data-Index
Warning: $HADOOP_HOME is deprecated.
15/05/04 10:14:07 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
Exception in thread "main" java.net.ConnectException: Call to localhost.localdomain/127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
at org.apache.hadoop.ipc.Client.call(Client.java:1118)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
at IndexHDFS.indexData(IndexHDFS.java:88)
at IndexHDFS.main(IndexHDFS.java:72)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:511)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:481)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:457)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:583)
at org.apache.hadoop.ipc.Client$Connection.access$2200(Client.java:205)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1249)
at org.apache.hadoop.ipc.Client.call(Client.java:1093)
... 20 more
As you can see, there are many "connection refused" errors, so you might suggest checking the firewall and network configuration to make sure port 9000 is open. I found a good test method here: http://goo.gl/ZYjoSy
It is a simple socket test program in which a client sends text to a server over a given port. I ran the program with port 9000, and **it worked successfully**.
So I am sure there is no problem with the network configuration.
Does anyone have an idea about hdfs://127.0.0.1:9000?
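For what it's worth, "connection refused" usually means nothing is listening on that port at all; a firewall typically drops packets and produces a timeout instead. A minimal sketch of the same kind of socket check, in Python (the host and port are just the ones from this thread):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# The NameNode RPC endpoint from the failing command
print(port_open("127.0.0.1", 9000))
```

If this prints False while the NameNode is supposedly running, the daemon is either down or bound to a different address or port than the one being tested.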
Regards,
Mahmood
Re: Connection issues
Posted by Mahmood Naderan <nt...@yahoo.com>.
>Try hdfs://hostname:9000
Same error:
15/05/04 11:16:12 INFO ipc.Client: Retrying connect to server: tiger/192.168.1.5:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
Exception in thread "main" java.net.ConnectException: Call to tiger/192.168.1.5:9000 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
According to the hosts file, tiger is mapped to 192.168.1.5.
I even changed the format of /etc/hosts. At the very least, the command should work with 127.0.0.1, but it doesn't!
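Since the connection is refused on both 127.0.0.1 and 192.168.1.5, it may also be worth confirming what fs.default.name in core-site.xml actually says and whether the NameNode daemon is running at all (e.g. with jps). A small sketch for reading the configured address (the parsing is generic; the config file path on any given machine is an assumption):

```python
import xml.etree.ElementTree as ET

def read_default_fs(core_site_path):
    """Return the fs.default.name value from a Hadoop 1.x core-site.xml, or None."""
    root = ET.parse(core_site_path).getroot()
    for prop in root.iter("property"):
        if prop.findtext("name") == "fs.default.name":
            return prop.findtext("value")
    return None

# e.g. read_default_fs("~/bigdatabench/apache/hadoop-1.0.2/conf/core-site.xml")
```

If the URI configured there names a different host or port than the one passed on the command line, the client and the NameNode are simply not meeting at the same address.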
Regards,
Mahmood
Re: Connection issues
Posted by Sandeep <sa...@gmail.com>.
Try hdfs://hostname:9000
Also, /etc/hosts entries should look like this:
<ip-address> <fqdn> <hostname>
Eg:
192.x.x.x Tiger.hadoop.com Tiger
Let me know if that works
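After editing /etc/hosts as above, it can be sanity-checked from the client side by confirming what each name actually resolves to. A quick sketch in Python (the hostnames are the ones from this thread):

```python
import socket

def resolve(name):
    """Return the IPv4 address a hostname resolves to, or an error string."""
    try:
        return socket.gethostbyname(name)
    except socket.gaierror as exc:
        return f"unresolvable: {exc}"

for name in ("localhost", "tiger"):
    print(name, "->", resolve(name))
```

If tiger does not come back as 192.168.1.5 here, the hosts-file change has not taken effect for the resolver the Hadoop client is using.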
Sent from my iPhone
> On May 3, 2015, at 10:52 PM, Mahmood Naderan <nt...@yahoo.com> wrote:
> [quoted text trimmed]