Posted to common-user@hadoop.apache.org by Edson Ramiro <er...@gmail.com> on 2010/02/22 14:17:46 UTC

Wrong FS

Hi all,

I'm getting this error

[hadoop@master01 hadoop-0.20.1 ]$ ./bin/hadoop jar
hadoop-0.20.1-examples.jar pi 1 1
Number of Maps  = 1
Samples per Map = 1
Wrote input for Map #0
Starting Job
java.lang.IllegalArgumentException: Wrong FS:
hdfs://10.0.0.101:9000/system/job_201002221311_0001, expected: hdfs://master01:9000

[...]

Do I need to set up DNS?

All my nodes are ok and the NameNode isn't in safe mode.

Any Idea?

Thanks in Advance.

Edson Ramiro

Re: Wrong FS

Posted by Edson Ramiro <er...@gmail.com>.
But when I use the hostname, the nodes don't find the masters.

Where should I use the hostnames?

And I understand I don't need to set up DNS, but if I do set it up, will that solve the problem?

Edson Ramiro


On 22 February 2010 10:56, Bill Habermaas <bi...@habermaas.us> wrote:

> This problem has been around for a long time. Hadoop picks up the local
> host name for the namenode and uses it in all URI checks. You cannot mix
> IP addresses and hostnames. This is especially a problem on Solaris and
> AIX systems, which is where I ran into it. You don't need to set up DNS;
> just use the hostname in your URIs. I did some patches for this for 0.18
> but have not redone them for 0.20.
>
> Bill
>
> -----Original Message-----
> From: Edson Ramiro [mailto:erlfilho@gmail.com]
> Sent: Monday, February 22, 2010 8:18 AM
> To: common-user@hadoop.apache.org
> Subject: Wrong FS
>
> Hi all,
>
> I'm getting this error
>
> [hadoop@master01 hadoop-0.20.1 ]$ ./bin/hadoop jar
> hadoop-0.20.1-examples.jar pi 1 1
> Number of Maps  = 1
> Samples per Map = 1
> Wrote input for Map #0
> Starting Job
> java.lang.IllegalArgumentException: Wrong FS:
> hdfs://10.0.0.101:9000/system/job_201002221311_0001, expected: hdfs://master01:9000
>
> [...]
>
> Do I need to set up DNS?
>
> All my nodes are ok and the NameNode isn't in safe mode.
>
> Any Idea?
>
> Thanks in Advance.
>
> Edson Ramiro
>
>
>

RE: Wrong FS

Posted by Bill Habermaas <bi...@habermaas.us>.
This problem has been around for a long time. Hadoop picks up the local host
name for the namenode and uses it in all URI checks. You cannot mix IP
addresses and hostnames. This is especially a problem on Solaris and AIX
systems, which is where I ran into it. You don't need to set up DNS; just use
the hostname in your URIs. I did some patches for this for 0.18 but have not
redone them for 0.20.
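
For reference, a minimal sketch of the setting Bill is describing, on the stock 0.20
conf layout and assuming master01 is the NameNode host on port 9000 as in the stack
trace (only the relevant property shown):

[hadoop@master01 hadoop-0.20.1 ]$ cat conf/core-site.xml
<?xml version="1.0"?>
<configuration>
  <!-- default filesystem named by hostname, not IP, so every HDFS URI
       Hadoop builds or checks uses the same hdfs://master01:9000 form -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master01:9000</value>
  </property>
</configuration>

Any URI passed on the command line or in job configs should then use the same
hdfs://master01:9000/... form rather than the 10.0.0.101 form.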

Bill

-----Original Message-----
From: Edson Ramiro [mailto:erlfilho@gmail.com] 
Sent: Monday, February 22, 2010 8:18 AM
To: common-user@hadoop.apache.org
Subject: Wrong FS

Hi all,

I'm getting this error

[hadoop@master01 hadoop-0.20.1 ]$ ./bin/hadoop jar
hadoop-0.20.1-examples.jar pi 1 1
Number of Maps  = 1
Samples per Map = 1
Wrote input for Map #0
Starting Job
java.lang.IllegalArgumentException: Wrong FS:
hdfs://10.0.0.101:9000/system/job_201002221311_0001, expected: hdfs://master01:9000

[...]

Do I need to set up DNS?

All my nodes are ok and the NameNode isn't in safe mode.

Any Idea?

Thanks in Advance.

Edson Ramiro



Re: Wrong FS

Posted by Marc Farnum Rendino <mv...@gmail.com>.
On Tue, Feb 23, 2010 at 9:38 AM, Edson Ramiro <er...@gmail.com> wrote:

> Thanks Marc and Bill
>
> I solved this Wrong FS problem by editing /etc/hosts as Marc said.
>
> Now, the cluster is working ok  : ]


Great; 'preciate the confirmation!

- Marc

Re: Wrong FS

Posted by Edson Ramiro <er...@gmail.com>.
Thanks Marc and Bill

I solved this Wrong FS problem by editing /etc/hosts as Marc said.

Now, the cluster is working ok  : ]

/etc/hosts on master01:
127.0.0.1 localhost.localdomain localhost
10.0.0.101 master01
10.0.0.102 master02
10.0.0.200 slave00
10.0.0.201 slave01

/etc/hosts on master02:
127.0.0.1 localhost.localdomain localhost
10.0.0.101 master01
10.0.0.102 master02
10.0.0.200 slave00
10.0.0.201 slave01

/etc/hosts on slave00:
127.0.0.1 slave00 localhost.localdomain localhost
10.0.0.101 master01
10.0.0.102 master02
10.0.0.201 slave01

/etc/hosts on slave01:
127.0.0.1 slave01 localhost.localdomain localhost
10.0.0.101 master01
10.0.0.102 master02
10.0.0.200 slave00

Edson Ramiro


On 22 February 2010 17:56, Marc Farnum Rendino <mv...@gmail.com> wrote:

> Perhaps an /etc/hosts file is sufficient.
>
> However, FWIW, I didn't get it working til I moved to using all the real
> FQDNs.
>
> - Marc
>

Re: Wrong FS

Posted by Marc Farnum Rendino <mv...@gmail.com>.
Perhaps an /etc/hosts file is sufficient.

However, FWIW, I didn't get it working til I moved to using all the real
FQDNs.

- Marc
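
If you do go the FQDN route Marc describes, the same rule applies, just with the
fully qualified form used everywhere. A quick check might look like this
(master01.example.com is a made-up name, not from this thread):

$ hostname -f                                  # the FQDN the machine reports for itself
$ grep -A1 fs.default.name conf/core-site.xml  # the value should use that exact FQDN,
                                               # e.g. hdfs://master01.example.com:9000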