Posted to common-user@hadoop.apache.org by Anh Vũ Nguyễn <an...@gmail.com> on 2009/02/24 05:00:06 UTC

hdfs disappears

Hi everyone!
I am using Hadoop Core (version 0.19.0) on Ubuntu 8.04, on a single
machine (for testing purposes). Every time I shut down my computer and
turn it on again, I can't access the distributed file system just by
running "${HADOOP_HOME}/bin/start-all.sh". All the data has disappeared,
and I have to reformat the file system (using ${HADOOP_HOME}/bin/hadoop
namenode -format) before "start-all.sh". Can anyone explain how to fix
this problem?
Thanks in advance.
Vu Nguyen.

Re: hdfs disappears

Posted by Anh Vũ Nguyễn <an...@gmail.com>.
Yes, you were right; the problem is fixed. Thanks again. :)

On Tue, Feb 24, 2009 at 11:15 AM, Matei Zaharia <ma...@cloudera.com> wrote:

> That would be a good idea, or to have the default be within the Hadoop
> installation directory so you can't miss it (as the logs directory is).
> Not sure why it was placed in /tmp.

Re: hdfs disappears

Posted by Matei Zaharia <ma...@cloudera.com>.
That would be a good idea, or to have the default be within the Hadoop
installation directory so you can't miss it (as the logs directory is). Not
sure why it was placed in /tmp.

On Mon, Feb 23, 2009 at 8:10 PM, Mark Kerzner <ma...@gmail.com> wrote:

> Exactly the same thing happened to me, and Brian gave the same answer.
> What if the default were changed to somewhere under the user's home
> directory?

Re: hdfs disappears

Posted by Mark Kerzner <ma...@gmail.com>.
Exactly the same thing happened to me, and Brian gave the same answer. What
if the default were changed to somewhere under the user's home directory?

On Mon, Feb 23, 2009 at 10:05 PM, Brian Bockelman <bb...@cse.unl.edu>wrote:

> Hello,
>
> Where are you saving your data?  If it's being written into /tmp, it will
> be deleted every time you restart your computer.  I believe writing into
> /tmp is the default for Hadoop unless you changed it in hadoop-site.xml.
>
> Brian

Re: hdfs disappears

Posted by Brian Bockelman <bb...@cse.unl.edu>.
Hello,

Where are you saving your data? If it's being written into /tmp, it will
be deleted every time you restart your computer. I believe writing into
/tmp is the default for Hadoop unless you changed it in hadoop-site.xml.
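
For example, a minimal hadoop-site.xml override might look like the
following sketch (the path is only an illustration; any directory that
survives reboots will do):

```xml
<?xml version="1.0"?>
<!-- hadoop-site.xml: settings here override hadoop-default.xml. -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <!-- Example path; replace with a directory that is not cleared on
         reboot. The stock default is /tmp/hadoop-${user.name}. -->
    <value>/home/vu/hadoop-data</value>
  </property>
</configuration>
```

Relocating hadoop.tmp.dir should be enough, since dfs.name.dir and
dfs.data.dir default to subdirectories of it; note you will need to run
"hadoop namenode -format" one more time after changing the path.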

Brian