Posted to common-user@hadoop.apache.org by bharath vissapragada <bh...@students.iiit.ac.in> on 2009/06/23 15:42:29 UTC

UnknownHostException

When I try to execute the command bin/start-dfs.sh, I get the following
error. I have checked the hadoop-site.xml file on all the nodes, and they
are fine.
Can someone help me out?

10.2.24.21: Exception in thread "main" java.net.UnknownHostException: unknown host: 10.2.24.21.
10.2.24.21:     at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
10.2.24.21:     at org.apache.hadoop.ipc.Client.getConnection(Client.java:779)
10.2.24.21:     at org.apache.hadoop.ipc.Client.call(Client.java:704)
10.2.24.21:     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
10.2.24.21:     at org.apache.hadoop.dfs.$Proxy4.getProtocolVersion(Unknown Source)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:306)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:343)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:288)

RE: UnknownHostException

Posted by zjffdu <zj...@gmail.com>.
I encountered this problem before. If you can ping the machine using its
hostname but cannot ping it using its IP address, then what you have to do
is add the mapping to /etc/hosts.
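
For example, assuming the master's IP is 10.2.24.21 and its hostname is
"master" (both appear in the logs later in this thread; the fully qualified
name below is just a placeholder), the /etc/hosts entry would look like:

    10.2.24.21    master.example.com    master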





Re: UnknownHostException

Posted by bharath vissapragada <bh...@gmail.com>.
The namenode is stopping automatically!
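
A quick way to see why it exits, assuming a standard tarball install with
logs under $HADOOP_HOME/logs (the exact file name depends on the user and
hostname):

    jps                                       # is a NameNode process still running?
    tail -n 50 logs/hadoop-*-namenode-*.log   # the reason for the exit is usually here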


Re: UnknownHostException

Posted by bharath vissapragada <bh...@gmail.com>.
It worked fine when I updated the /etc/hosts file (on all the slaves) and
wrote the fully qualified domain name in hadoop-site.xml.

It worked fine for some time, then started giving a new error:

09/06/23 22:21:49 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 0 time(s).
09/06/23 22:21:50 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 1 time(s).
09/06/23 22:21:51 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 2 time(s).
09/06/23 22:21:52 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 3 time(s).
09/06/23 22:21:53 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 4 time(s).
09/06/23 22:21:54 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 5 time(s).
09/06/23 22:21:55 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 6 time(s).
09/06/23 22:21:56 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 7 time(s).
09/06/23 22:21:57 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 8 time(s).
09/06/23 22:21:58 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 9 time(s).
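
The "Retrying connect" loop only means that nothing is accepting
connections on master:54310, which fits the namenode dying on startup. A
quick check on the master, assuming a Linux box (netstat options vary by
platform):

    getent hosts master          # does the name still resolve to 10.2.24.21?
    netstat -tln | grep 54310    # is anything listening on the namenode port?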



Re: UnknownHostException

Posted by Raghu Angadi <ra...@yahoo-inc.com>.
Raghu Angadi wrote:
> 
> This is at RPC client level and there is requirement for fully qualified 

I meant to say "there is NO requirement ..."

> hostname. Maybe the "." at the end of "10.2.24.21" is causing the problem?
> 
> btw, in 0.21 even fs.default.name does not need to be a fully qualified

that fix is probably in 0.20 too.

Raghu.

> name.. anything that resolves to an IP address is fine (at least for 
> common/FS and HDFS).
> 
> Raghu.
> 
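
In other words, once that fix is in, a short name works as well as an
FQDN. A minimal sketch of the hadoop-site.xml entry, reusing the "master"
name and port 54310 from the logs above:

    <property>
      <name>fs.default.name</name>
      <value>hdfs://master:54310</value>
    </property>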


Re: UnknownHostException

Posted by Raghu Angadi <ra...@yahoo-inc.com>.
This is at RPC client level and there is requirement for fully qualified 
hostname. Maybe the "." at the end of "10.2.24.21" is causing the problem?

btw, in 0.21 even fs.default.name does not need to be a fully qualified 
name.. anything that resolves to an IP address is fine (at least for 
common/FS and HDFS).

Raghu.
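
One way to check what will and won't resolve on a node (Linux; getent
consults /etc/hosts as well as DNS):

    getent hosts master        # should print 10.2.24.21
    getent hosts 10.2.24.21.   # the trailing "." can make this lookup fail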



Re: UnknownHostException

Posted by Matt Massie <ma...@cloudera.com>.
fs.default.name in your hadoop-site.xml needs to be set to a
fully-qualified domain name (instead of an IP address).

-Matt
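
A minimal sketch of what this looks like in hadoop-site.xml, using the
placeholder FQDN master.example.com and the port 54310 seen elsewhere in
this thread:

    <property>
      <name>fs.default.name</name>
      <value>hdfs://master.example.com:54310</value>
    </property>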
