Posted to hdfs-user@hadoop.apache.org by Jean-Marc Spaggiari <je...@spaggiari.org> on 2013/05/16 18:45:47 UTC

Re: Getting this exception "java.net.ConnectException: Call From ubuntu/127.0.0.1 to ubuntu:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused"

Then let's move the discussion to the Hadoop user mailing list ;)

From what I can see in your feedback, it seems that Hadoop is simply not
running. Are you able to see any processes? Can you do anything at all?

JM
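[The questions above (is Hadoop actually running, is anything listening on the NameNode port) can be sketched as a couple of shell checks. This is a hypothetical helper, not part of Hadoop: `jps` and port 8020 assume a default single-node Hadoop 1.x setup.]

```shell
# Quick liveness checks for a single-node Hadoop 1.x setup.
# 8020 is the default NameNode RPC port assumed here.
jps 2>/dev/null || true   # should list NameNode, DataNode, JobTracker, ...

check_port() {
  # Print OPEN if host:port accepts a TCP connection, CLOSED otherwise.
  # Uses bash's /dev/tcp pseudo-device inside a short timeout.
  host=$1; port=$2
  if timeout 2 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "OPEN"
  else
    echo "CLOSED"
  fi
}

check_port localhost 8020   # CLOSED means the NameNode is not listening
```

[If `check_port` prints CLOSED, the "Connection refused" in the job output is expected: nothing is accepting connections on the port the client dials.]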



>
> ---------- Forwarded message ----------
> From: Balakrishna Dhanekula <ba...@gmail.com>
> Date: 2013/5/16
> Subject: Re: Getting this exception "java.net.ConnectException: Call From
> ubuntu/127.0.0.1 to ubuntu:8020 failed on connection exception:
> java.net.ConnectException: Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused"
>
>
>
> Hi JM,
>
> Actually, I am using Apache Hadoop 1.1.1, not the Cloudera installer.
>
> When I run the file system check I get the same error.
>
> Thanks,
> Bala.
>
>
> On Thu, May 16, 2013 at 7:47 AM, Jean-Marc Spaggiari <jm...@cloudera.com> wrote:
>
>> Hi Bala,
>>
>> Before trying the pi example, have you verified that your Hadoop server is
>> working fine? What do you have in the logs? What do you see on the Hadoop web interface?
>> Is Cloudera Manager telling you that hadoop is up and running?
>>
>> Are simple commands like hadoop fs -ls working fine?
>>
>> JM
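[Besides `hadoop fs -ls`, it can help to confirm which host:port the client is actually dialing. A minimal sketch, assuming a Hadoop 1.x layout where `fs.default.name` lives in core-site.xml; the default conf path below is an assumption, adjust it to your install.]

```shell
# Print the hdfs:// URI (host:port) that a Hadoop client will dial,
# read from core-site.xml. Assumes the <value> sits on one line, as
# in a typical hand-written config.
namenode_uri() {
  sed -n 's#.*<value>\(hdfs://[^<]*\)</value>.*#\1#p' "$1"
}

conf="${HADOOP_CONF_DIR:-/usr/local/hadoop/conf}/core-site.xml"
if [ -f "$conf" ]; then
  namenode_uri "$conf"
fi
```

[Whatever host:port this prints is what shows up in the "Call From ... to ..." part of the exception; that host must resolve to a reachable address.]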
>>
>>
>> 2013/5/15 Balakrishna Dhanekula <ba...@gmail.com>
>>
>>> Hi Jean-Marc Spaggiari,
>>>
>>> Thanks for taking interest and replying.
>>>
>>> I have changed the hosts entries to:
>>>
>>> root@ubuntu:/etc# cat hosts
>>> 127.0.0.1 localhost
>>> 192.168.1.102 ubuntu
>>>
>>> Moreover, I have a dynamic IP provided by my Internet provider. Every
>>> time I connect to the Internet, I get a new IP address.
>>>
>>> I am still having the same problem. Here's the error:
>>>
>>> ******************************************************************************************************************************************
>>>
>>> hduser@ubuntu:~/hadoop111/bin$ hadoop jar ../hadoop-examples-1.1.1.jar
>>> pi 3 10
>>> Number of Maps  = 3
>>> Samples per Map = 10
>>> java.net.ConnectException: Call From ubuntu/192.168.1.102 to
>>> ubuntu:8020 failed on connection exception: java.net.ConnectException:
>>> Connection refused; For more details see:
>>> http://wiki.apache.org/hadoop/ConnectionRefused
>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method)
>>>
>>> ******************************************************************************************************************
>>>
>>>
>>>
>>> Thanks,
>>> Bala.
>>>
>>>
>>> On Wednesday, May 15, 2013 11:21:57 PM UTC+5:30, Jean-Marc Spaggiari
>>> wrote:
>>>
>>>>  Hi Balakrishna,
>>>>
>>>> In your hosts file, please replace the second 127.0.0.1 with your local
>>>> IP and retry.
>>>>
>>>> Like:
>>>> 127.0.0.1 localhost
>>>> 192.168.1.2 ubuntu
>>>>
>>>> You can also add the FQDN:
>>>> 127.0.0.1 localhost
>>>> 192.168.1.2 ubuntu     ubuntu.my.domain.name
>>>>
>>>> JM
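[The advice above can be checked mechanically: flag any hostname other than localhost that is still mapped to the loopback address. A small sketch; the awk helper is illustrative and not part of any Hadoop tooling.]

```shell
# List hostnames in a hosts file that point at 127.0.0.1 but are not
# "localhost" -- entries like "127.0.0.1 ubuntu" can make the NameNode
# reachable only on loopback, so other addresses get refused.
loopback_mapped() {
  awk '$1 == "127.0.0.1" {
         for (i = 2; i <= NF; i++)
           if ($i != "localhost") print $i
       }' "$1"
}

loopback_mapped /etc/hosts
```

[An empty result means the hosts file follows the layout suggested above; any name it prints should be moved to the machine's LAN IP line.]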
>>>>
>>>>
>>>> 2013/5/15 Balakrishna Dhanekula <balakrishna....@gmail.com>
>>>>
>>>> Hi,
>>>>>
>>>>> When trying to run the job I am getting this exception. I am using
>>>>> Apache Hadoop 1.1.1 version. I could see all the services started
>>>>> successfully.
>>>>>
>>>>> Can someone please help me?
>>>>>
>>>>> My entries in the /etc/hosts file are:
>>>>>
>>>>> ************************************************
>>>>> 127.0.0.1 localhost
>>>>> 127.0.0.1 ubuntu
>>>>>
>>>>> # The following lines are desirable for IPv6 capable hosts
>>>>> ::1 ip6-localhost ip6-loopback
>>>>> fe00::0 ip6-localnet
>>>>> ff00::0 ip6-mcastprefix
>>>>> ff02::1 ip6-allnodes
>>>>> ff02::2 ip6-allrouters
>>>>> ff02::3 ip6-allhosts
>>>>> ****************************************************************
>>>>>
>>>>> hduser@ubuntu:~/hadoop111/bin$ hadoop jar
>>>>> ../hadoop-examples-1.1.1.jar pi 3 10
>>>>> Number of Maps  = 3
>>>>> Samples per Map = 10
>>>>> java.net.ConnectException: Call From ubuntu/127.0.0.1 to ubuntu:8020
>>>>> failed on connection exception: java.net.ConnectException: Connection
>>>>> refused; For more details see:
>>>>> http://wiki.apache.org/hadoop/ConnectionRefused
>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:779)
>>>>>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:726)
>>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>>>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>>>>>     at $Proxy9.getFileInfo(Unknown Source)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>>>>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>>>>>     at $Proxy9.getFileInfo(Unknown Source)
>>>>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:628)
>>>>>     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
>>>>>     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:805)
>>>>>     at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1367)
>>>>>     at org.apache.hadoop.examples.PiEstimator.estimate(PiEstimator.java:269)
>>>>>     at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:342)
>>>>>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>>>     at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:351)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
>>>>>     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
>>>>>     at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>>>> Caused by: java.net.ConnectException: Connection refused
>>>>>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>>>>     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
>>>>>     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)
>>>>>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:525)
>>>>>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
>>>>>     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:499)
>>>>>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:593)
>>>>>     at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:241)
>>>>>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1278)
>>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1196)
>>>>>     ... 29 more
>>>>> hduser@ubuntu:~/hadoop111/bin$
>>>>>
>>>>>
>>>>
>>
>
>