Posted to common-user@hadoop.apache.org by Paolo Di Tommaso <pa...@gmail.com> on 2011/11/08 11:54:18 UTC

Hadoop PseudoDistributed configuration

Dear all,

I'm trying to install Hadoop (0.20.2) in pseudo-distributed mode to run
some tests on a Linux machine (Fedora 8).

I have followed the installation steps in the guide available here
http://hadoop.apache.org/common/docs/current/single_node_setup.html#PseudoDistributed


The daemons start with no problem, but when I access the HDFS file
system (hadoop fs -ls /) it shows all the content of the underlying (real)
file system. This seems really strange to me, because I expected HDFS to
work as an independent file system.


Has anybody had the same problem? Any suggestions on where I might be
going wrong with the configuration?


Thank you,
Paolo

Re: Hadoop PseudoDistributed configuration

Posted by Paolo Di Tommaso <pa...@gmail.com>.
Hi Joey,

I've just figured out that I had two conflicting versions of Hadoop installed.

Thanks for helping.

Paolo


On Tue, Nov 8, 2011 at 1:06 PM, Joey Echeverria <jo...@cloudera.com> wrote:

> What is your setting for fs.default.name?
>
> -Joey
>
> On Nov 8, 2011, at 5:54, Paolo Di Tommaso <pa...@gmail.com>
> wrote:
>
> > Dear all,
> >
> > I'm trying to install Hadoop (0.20.2) in pseudo-distributed mode to run
> > some tests on a Linux machine (Fedora 8).
> >
> > I have followed the installation steps in the guide available here
> >
> http://hadoop.apache.org/common/docs/current/single_node_setup.html#PseudoDistributed
> >
> >
> > The daemons start with no problem, but when I access the HDFS file
> > system (hadoop fs -ls /) it shows all the content of the underlying (real)
> > file system. This seems really strange to me, because I expected HDFS to
> > work as an independent file system.
> >
> >
> > Has anybody had the same problem? Any suggestions on where I might be
> > going wrong with the configuration?
> >
> >
> > Thank you,
> > Paolo
>

Re: Hadoop PseudoDistributed configuration

Posted by Uma Maheswara Rao G 72686 <ma...@huawei.com>.
Did you configure fs.default.name with a DFS address?
You might have configured file:///.
If so, please update it to an HDFS address, hdfs://xx.xx.xx.xx:port, and try again.

You need to add this in the core-site.xml file.
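
For example, a minimal core-site.xml entry might look like the sketch below
(hdfs://localhost:9000 is only a placeholder; use your own NameNode host and
port), and restart the daemons after changing it:

<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- placeholder: replace localhost:9000 with your NameNode host and port -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>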

Regards,
Uma 
----- Original Message -----
From: Joey Echeverria <jo...@cloudera.com>
Date: Tuesday, November 8, 2011 5:37 pm
Subject: Re: Hadoop PseudoDistributed configuration
To: "common-user@hadoop.apache.org" <co...@hadoop.apache.org>
Cc: "common-user@hadoop.apache.org" <co...@hadoop.apache.org>

> What is your setting for fs.default.name?
> 
> -Joey
> 
> On Nov 8, 2011, at 5:54, Paolo Di Tommaso <pa...@gmail.com> wrote:
> 
> > Dear all,
> > 
> > I'm trying to install Hadoop (0.20.2) in pseudo-distributed mode to run
> > some tests on a Linux machine (Fedora 8).
> > 
> > I have followed the installation steps in the guide available here
> > http://hadoop.apache.org/common/docs/current/single_node_setup.html#PseudoDistributed
> > 
> > 
> > The daemons start with no problem, but when I access the HDFS file
> > system (hadoop fs -ls /) it shows all the content of the underlying (real)
> > file system. This seems really strange to me, because I expected HDFS to
> > work as an independent file system.
> > 
> > 
> > Has anybody had the same problem? Any suggestions on where I might be
> > going wrong with the configuration?
> > 
> > 
> > Thank you,
> > Paolo
> 

Re: Hadoop PseudoDistributed configuration

Posted by Joey Echeverria <jo...@cloudera.com>.
What is your setting for fs.default.name?

-Joey

On Nov 8, 2011, at 5:54, Paolo Di Tommaso <pa...@gmail.com> wrote:

> Dear all,
> 
> I'm trying to install Hadoop (0.20.2) in pseudo-distributed mode to run
> some tests on a Linux machine (Fedora 8).
> 
> I have followed the installation steps in the guide available here
> http://hadoop.apache.org/common/docs/current/single_node_setup.html#PseudoDistributed
> 
> 
> The daemons start with no problem, but when I access the HDFS file
> system (hadoop fs -ls /) it shows all the content of the underlying (real)
> file system. This seems really strange to me, because I expected HDFS to
> work as an independent file system.
> 
> 
> Has anybody had the same problem? Any suggestions on where I might be
> going wrong with the configuration?
> 
> 
> Thank you,
> Paolo